Tutorial 4 - Li et al., 2022
1.4 Model the Data Generation Process
First let's define the functions
function d_Inequality = inequality(a1, b1, a2, b2)
% Difference in payoff inequality between choice 2 and choice 1
d_Inequality = abs(a2 - b2) - abs(a1 - b1);
end
function d_lossAdvantaged = harm(a0, b0, a1, b1, a2, b2)
% Difference in how much each choice takes from the initially advantaged player
initial = [a0, b0];
choice1 = [a1, b1];
choice2 = [a2, b2];
[~, advantaged] = max(initial);
d_lossAdvantaged = (initial(advantaged) - choice1(advantaged)) - (initial(advantaged) - choice2(advantaged));
end
function rankReverseDiff = rankReverse(a0, b0, a1, b1, a2, b2)
% Flag whether each choice reverses the initial rank order, then take the difference
d_initial = a0 - b0; d_choice1 = a1 - b1; d_choice2 = a2 - b2;
choice1Reversed = 0; choice2Reversed = 0;
if d_initial > 0
    if d_choice1 < 0, choice1Reversed = 1; end
    if d_choice2 < 0, choice2Reversed = 1; end
else
    if d_choice1 > 0, choice1Reversed = 1; end
    if d_choice2 > 0, choice2Reversed = 1; end
end
rankReverseDiff = choice1Reversed - choice2Reversed;
end
Now let's check and see if they do what we want. Let's make an example trial:
example = table(20, 0, 8, 12, 15, 5, 'VariableNames', ["a0", "b0", "a1", "b1", "a2", "b2"])
Now get the outputs based on this example
inequality(example.a1(1), example.b1(1), example.a2(1), example.b2(1))
harm(example.a0(1), example.b0(1), example.a1(1), example.b1(1), example.a2(1), example.b2(1))
rankReverse(example.a0(1), example.b0(1), example.a1(1), example.b1(1), example.a2(1), example.b2(1))
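Working these out by hand for the example: player A starts advantaged (20 vs. 0). Choice 1 leaves inequality |8 - 12| = 4 while choice 2 leaves |15 - 5| = 10, so inequality() returns 10 - 4 = 6. Choice 1 costs the advantaged player 20 - 8 = 12 and choice 2 costs 20 - 15 = 5, so harm() returns 12 - 5 = 7. Choice 1 flips who is ahead (8 < 12) while choice 2 does not, so rankReverse() returns 1 - 0 = 1. If the printed outputs match these values, the functions do what we want.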
1.5 Simulating Data
Now let's build the trial list and define the simulation parameters.
a0 = randi([10, 20], 100, 1);
b0 = 20 - a0; % every pie sums to 20
trialList = table(a0, b0);
for i = 1:height(trialList)
    trialList.a1(i) = randi([5, trialList.a0(i)]);
end
trialList.b1 = 20 - trialList.a1;
for i = 1:height(trialList)
    trialList.a2(i) = randi([5, trialList.a0(i)]);
    while trialList.a2(i) == trialList.a1(i) % redraw until the two choices differ
        trialList.a2(i) = randi([5, trialList.a0(i)]);
    end
end
trialList.b2 = 20 - trialList.a2;
% Append a role-flipped copy of every trial; rename the swapped columns so the
% tables concatenate positionally rather than matching on variable names
flipped = trialList(:, [2, 1, 4, 3, 6, 5]);
flipped.Properties.VariableNames = trialList.Properties.VariableNames;
trialList = [trialList; flipped]
trialList = 200×6 table
| | a0 | b0 | a1 | b1 | a2 | b2 |
|---|---|---|---|---|---|---|
| 1 | 18 | 2 | 7 | 13 | 14 | 6 |
| 2 | 19 | 1 | 16 | 4 | 10 | 10 |
| 3 | 11 | 9 | 7 | 13 | 10 | 10 |
| 4 | 20 | 0 | 13 | 7 | 10 | 10 |
| 5 | 16 | 4 | 6 | 14 | 9 | 11 |
| 6 | 11 | 9 | 9 | 11 | 11 | 9 |
| 7 | 13 | 7 | 7 | 13 | 12 | 8 |
| 8 | 16 | 4 | 12 | 8 | 11 | 9 |
| 9 | 20 | 0 | 16 | 4 | 14 | 6 |
| 10 | 20 | 0 | 16 | 4 | 14 | 6 |
| 11 | 11 | 9 | 8 | 12 | 6 | 14 |
| 12 | 20 | 0 | 6 | 14 | 9 | 11 |
| 13 | 20 | 0 | 8 | 12 | 12 | 8 |
| 14 | 15 | 5 | 15 | 5 | 7 | 13 |
| ⋮ |
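Before moving on, it's worth asserting the invariants the construction is supposed to guarantee (a minimal sanity check using the trialList built above):
assert(all(trialList.a0 + trialList.b0 == 20)); % every pie sums to 20
assert(all(trialList.a1 + trialList.b1 == 20));
assert(all(trialList.a2 + trialList.b2 == 20));
assert(all(trialList.a1 ~= trialList.a2)); % the two choices always differ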
function util = utility(pars, IVs)
% Weighted combination of the three components; alpha, delta, rho are free parameters
a0 = IVs(1); b0 = IVs(2); a1 = IVs(3); b1 = IVs(4); a2 = IVs(5); b2 = IVs(6);
alpha = pars(1); delta = pars(2); rho = pars(3);
util = (alpha * inequality(a1, b1, a2, b2)) - (delta * harm(a0, b0, a1, b1, a2, b2)) - (rho * rankReverse(a0, b0, a1, b1, a2, b2));
end
function prob = probability(pars, utilitydiff)
% Logistic choice rule with a lapse rate (epsilon) and a response bias (gamma)
beta = pars(end-2); epsilon = pars(end-1); gamma = pars(end);
prob = 1 / (1 + exp(-(beta * utilitydiff)));
prob = prob * (1 - 2 * epsilon) + epsilon + gamma * (2 * epsilon);
prob = max(min(prob, 0.9999999999), 0.00000000001); % keep log-likelihoods finite
end
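To see what the epsilon and gamma lines do, take utilitydiff = 1 with beta = 4: the logistic alone gives 1/(1 + e^-4) ≈ 0.982. With epsilon = 0.25 and gamma = 0 that is compressed to 0.982 × 0.5 + 0.25 ≈ 0.741, so even a decisive utility difference cannot push the probability outside [epsilon, 1 - epsilon]; a nonzero gamma then shifts this compressed range up or down by gamma × 2 × epsilon.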
% Parameter grids to cross. The original grid values were elided; these are
% illustrative placeholders spanning the bounds used below.
mainpars = [0, 0.5, 1, 1.5]; bet = [0.5, 1, 1.5, 2];
eps = [0, 0.25, 0.5]; gam = [-0.5, 0, 0.5]; % note: 'eps' shadows MATLAB's built-in eps
freeParameters = struct();
for i = 1:length(mainpars)
for j = 1:length(mainpars)
for k = 1:length(mainpars)
for l = 1:length(bet)
for m = 1:length(eps)
for n = 1:length(gam)
freeParameters(i, j, k, l, m, n).alpha = mainpars(i) + rand(1,1)*0.5;
freeParameters(i, j, k, l, m, n).delta = mainpars(j) + rand(1,1)*0.5;
freeParameters(i, j, k, l, m, n).rho = mainpars(k) + rand(1,1)*0.5;
freeParameters(i, j, k, l, m, n).beta = bet(l).*rand(1,1)*5;
freeParameters(i, j, k, l, m, n).epsilon = eps(m);
freeParameters(i, j, k, l, m, n).gamma = gam(n) + rand(1,1)*0.5;
end, end, end
end, end, end
function pred = generatePredictions(parameters, df)
% Predicted choice probabilities for every trial in df (first six columns are the IVs)
pred = zeros(size(df, 1), 1);
for i = 1:size(df, 1)
    thisTrialIVs = table2array(df(i, :));
    utilityDiff = utility(parameters, thisTrialIVs);
    pred(i) = probability(parameters, utilityDiff);
end
end
Now that all of that's done, let's generate predictions for every parameter combination.
for i = 1:length(mainpars)
for j = 1:length(mainpars)
for k = 1:length(mainpars)
for l = 1:length(bet)
for m = 1:length(eps)
for n = 1:length(gam)
pars = [freeParameters(i, j, k, l, m, n).alpha, freeParameters(i, j, k, l, m, n).delta, freeParameters(i, j, k, l, m, n).rho, freeParameters(i, j, k, l, m, n).beta, freeParameters(i, j, k, l, m, n).epsilon, freeParameters(i, j, k, l, m, n).gamma];
freeParameters(i, j, k, l, m, n).predictions = generatePredictions(pars, trialList);
end, end, end
end, end, end
1.6 Compare Recovered Parameters
Let's write the objective function
function obj_val = obj_function(params, df, optimMethod)
Prob1 = generatePredictions(params, df);
Chose1 = table2array(df(:, 7)); % column 7 holds the observed choices
if strcmp(optimMethod, 'OLS')
    obj_val = sum((Chose1 - Prob1) .^ 2);
elseif strcmp(optimMethod, 'MLE')
    obj_val = -sum(Chose1 .* log(Prob1) + (1 - Chose1) .* log(1 - Prob1));
end
end
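A quick feel for the two objectives: a trial with Chose1 = 1 and Prob1 = 0.8 contributes (1 - 0.8)^2 = 0.04 to the OLS sum but -log(0.8) ≈ 0.22 to the negative log-likelihood, while a confidently wrong prediction (Prob1 = 0.01) contributes 0.98 under OLS but ≈ 4.6 under MLE. The likelihood therefore punishes miscalibrated confidence much more harshly.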
Now we can set up the optimizer (turning off fmincon's display so each run's convergence message doesn't flood the output).
initial_params = [1, 1, 1, 4, 0.25, 0];
lower_bounds = [0, 0, 0, 0, 0, -0.5];
upper_bounds = [2, 2, 2, 10, 0.5, 0.5];
function result = optimize(obj, initial_params, lower_bounds, upper_bounds, df)
% Fit with OLS first, then use that solution as the starting point for the MLE fit
result = fmincon(@(x) obj(x, df, 'OLS'), initial_params, [], [], [], [], lower_bounds, upper_bounds, [], optimoptions('fmincon', 'Display', 'off'));
result = fmincon(@(x) obj(x, df, 'MLE'), result, [], [], [], [], lower_bounds, upper_bounds, [], optimoptions('fmincon', 'Display', 'off'));
end
And this lets us recover the free parameters
for i = 1:length(mainpars)
for j = 1:length(mainpars)
for k = 1:length(mainpars)
for l = 1:length(bet)
for m = 1:length(eps)
for n = 1:length(gam)
% Simulate Bernoulli choices from the predicted probabilities
trialList.Chose1(:) = round(freeParameters(i, j, k, l, m, n).predictions .* 1000) > randi([1, 1000], length(freeParameters(i, j, k, l, m, n).predictions), 1);
result = optimize(@obj_function, initial_params, lower_bounds, upper_bounds, trialList);
freeParameters(i, j, k, l, m, n).alphaRecovered = result(1);
freeParameters(i, j, k, l, m, n).deltaRecovered = result(2);
freeParameters(i, j, k, l, m, n).rhoRecovered = result(3);
freeParameters(i, j, k, l, m, n).betaRecovered = result(4);
freeParameters(i, j, k, l, m, n).epsilonRecovered = result(5);
freeParameters(i, j, k, l, m, n).gammaRecovered = result(6);
end, end, end
end, end, end
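Aside: the round-and-randi line is just a discretized Bernoulli draw; an essentially equivalent and more direct formulation (same variables as above) would be:
p = freeParameters(i, j, k, l, m, n).predictions;
trialList.Chose1(:) = rand(length(p), 1) < p;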
Let's extract values to format in a plottable way
totalIterations = length(mainpars)^3 * length(bet) * length(eps) * length(gam);
parsPlot = table();
parsPlot.alpha = NaN(totalIterations, 1);
parsPlot.alphaRecovered = NaN(totalIterations, 1);
parsPlot.delta = NaN(totalIterations, 1);
parsPlot.deltaRecovered = NaN(totalIterations, 1);
parsPlot.rho = NaN(totalIterations, 1);
parsPlot.rhoRecovered = NaN(totalIterations, 1);
parsPlot.beta = NaN(totalIterations, 1);
parsPlot.betaRecovered = NaN(totalIterations, 1);
parsPlot.epsilon = NaN(totalIterations, 1);
parsPlot.epsilonRecovered = NaN(totalIterations, 1);
parsPlot.gamma = NaN(totalIterations, 1);
parsPlot.gammaRecovered = NaN(totalIterations, 1);
row = 0; % fill the preallocated rows rather than appending past them
for i = 1:length(mainpars)
for j = 1:length(mainpars)
for k = 1:length(mainpars)
for l = 1:length(bet)
for m = 1:length(eps)
for n = 1:length(gam)
row = row + 1;
parsPlot.alpha(row) = freeParameters(i, j, k, l, m, n).alpha;
parsPlot.alphaRecovered(row) = freeParameters(i, j, k, l, m, n).alphaRecovered;
parsPlot.delta(row) = freeParameters(i, j, k, l, m, n).delta;
parsPlot.deltaRecovered(row) = freeParameters(i, j, k, l, m, n).deltaRecovered;
parsPlot.rho(row) = freeParameters(i, j, k, l, m, n).rho;
parsPlot.rhoRecovered(row) = freeParameters(i, j, k, l, m, n).rhoRecovered;
parsPlot.beta(row) = freeParameters(i, j, k, l, m, n).beta;
parsPlot.betaRecovered(row) = freeParameters(i, j, k, l, m, n).betaRecovered;
parsPlot.epsilon(row) = freeParameters(i, j, k, l, m, n).epsilon;
parsPlot.epsilonRecovered(row) = freeParameters(i, j, k, l, m, n).epsilonRecovered;
parsPlot.gamma(row) = freeParameters(i, j, k, l, m, n).gamma;
parsPlot.gammaRecovered(row) = freeParameters(i, j, k, l, m, n).gammaRecovered;
end, end, end
end, end, end
So now we can check how reliably the true parameters are recovered by plotting actual against recovered values, grouped by the true epsilon (gscatter handles the categorical grouping, since scatter's color input must be numeric).
parsPlot.Epsilon = categorical(parsPlot.epsilon);
gscatter(parsPlot.alpha, parsPlot.alphaRecovered, parsPlot.Epsilon);
hold on; lsline; plot(xlim, ylim, '--k'); hold off;
title('Alpha'); xlabel('Actual'); ylabel('Recovered');
gscatter(parsPlot.delta, parsPlot.deltaRecovered, parsPlot.Epsilon);
hold on; lsline; plot(xlim, ylim, '--k'); hold off;
title('Delta'); xlabel('Actual'); ylabel('Recovered');
gscatter(parsPlot.rho, parsPlot.rhoRecovered, parsPlot.Epsilon);
hold on; lsline; plot(xlim, ylim, '--k'); hold off;
title('Rho'); xlabel('Actual'); ylabel('Recovered');
gscatter(parsPlot.beta, parsPlot.betaRecovered, parsPlot.Epsilon);
hold on; lsline; plot(xlim, ylim, '--k'); hold off;
title('Beta'); xlabel('Actual'); ylabel('Recovered');
scatter(parsPlot.epsilon, parsPlot.epsilonRecovered);
hold on; lsline; plot(xlim, ylim, '--k'); hold off;
title('Epsilon'); xlabel('Actual'); ylabel('Recovered');
gscatter(parsPlot.gamma, parsPlot.gammaRecovered, parsPlot.Epsilon);
hold on; plot(xlim, ylim, '--k'); hold off;
title('Gamma'); xlabel('Actual'); ylabel('Recovered');
These are a bit difficult to interpret visually, but nothing looks particularly accurate. Let's take a closer look at the situations where we should expect the best accuracy: no lapses (epsilon == 0) and high choice sensitivity (beta > 5).
scatter(parsPlot.alpha(parsPlot.epsilon == 0 & parsPlot.beta > 5), parsPlot.alphaRecovered(parsPlot.epsilon == 0 & parsPlot.beta > 5));
hold on; plot(xlim, ylim, '--k'); hold off;
title('Alpha'); lsline; xlabel('Actual'); ylabel('Recovered');
scatter(parsPlot.delta(parsPlot.epsilon == 0 & parsPlot.beta > 5), parsPlot.deltaRecovered(parsPlot.epsilon == 0 & parsPlot.beta > 5));
hold on; plot(xlim, ylim, '--k'); hold off;
title('Delta'); lsline; xlabel('Actual'); ylabel('Recovered');
scatter(parsPlot.rho(parsPlot.epsilon == 0 & parsPlot.beta > 5), parsPlot.rhoRecovered(parsPlot.epsilon == 0 & parsPlot.beta > 5));
hold on; plot(xlim, ylim, '--k'); hold off;
title('Rho'); lsline; xlabel('Actual'); ylabel('Recovered');
scatter(parsPlot.beta(parsPlot.epsilon == 0 & parsPlot.beta <=10), parsPlot.betaRecovered(parsPlot.epsilon == 0 & parsPlot.beta <=10));
hold on; plot(xlim, ylim, '--k'); hold off;
title('Beta'); lsline; xlabel('Actual'); ylabel('Recovered');
scatter(parsPlot.epsilon(parsPlot.beta > 5), parsPlot.epsilonRecovered(parsPlot.beta > 5));
hold on; plot(xlim, ylim, '--k'); hold off;
title('Epsilon'); lsline; xlabel('Actual'); ylabel('Recovered');
scatter(parsPlot.gamma(parsPlot.epsilon == 0.5 & parsPlot.beta > 5), parsPlot.gammaRecovered(parsPlot.epsilon == 0.5 & parsPlot.beta > 5));
hold on; plot(xlim, ylim, '--k'); hold off;
title('Gamma'); lsline; xlabel('Actual'); ylabel('Recovered');
There does seem to be a positive correlation between actual and recovered values, but it's not particularly reliable: Beta, Epsilon, and Gamma are the strongest; Alpha and Delta are fairly poor; and Rho looks close to random.
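To attach numbers to that impression, a quick check (reusing the best-case filter from the plots above):
good = parsPlot.epsilon == 0 & parsPlot.beta > 5;
corr(parsPlot.alpha(good), parsPlot.alphaRecovered(good))
corr(parsPlot.delta(good), parsPlot.deltaRecovered(good))
corr(parsPlot.rho(good), parsPlot.rhoRecovered(good))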
2.1 Recovering Free Parameters
First let’s get the trial data from participants
trialData = readtable("C:/Users/DELL/Downloads/Data/Data/HPP_fMRI_beh_data_for_lmm.csv")
trialData = 9104×51 table
| | subject | trial_num | run | partnerA | partnerB | initial_A | initial_B | Pie_initial_high | Pie_initial_low | diff_initial | transfer_size_1 | fairness_1 | A_final_1 | B_final_1 | Pie_Alter1_ini_high | Pie_Alter1_ini_low | diff_final_1 | reverse_1 | transfer_size_2 | fairness_2 | A_final_2 | B_final_2 | Pie_Alter2_ini_high | Pie_Alter2_ini_low | Equal_alter_pie_ini_high | Equal_alter_pie_ini_low | Unqual_alter_pie_ini_high | Unequal_alter_pie_ini_low | diff_final_2 | reverse_2 | transfer_diff | fairness_diff | trail_type | order_option | order_orientation | HPP_key | reaction | If_Rev | If_Rev_1 | Revsize1 | Revsize2 | TransferSize1_a | TransferSize2_a | TransferSize1_b | TransferSize2_b | Mah | Mal | Mbh | Mbl | Trans1_Diff | Trans2_Diff |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| 1 | 101 | 1 | 1 | 46 | 43 | 15 | 2 | 15 | 2 | 13 | 5 | 10 | 10 | 7 | 10 | 7 | 3 | 0 | 1 | 2 | 14 | 3 | 14 | 3 | 10 | 7 | 14 | 3 | 11 | 0 | 4 | 8 | 1 | 2 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 5 | 0 | 1 | 0 | 10 | 7 | 14 | 3 | 4 | 0 |
| 2 | 101 | 2 | 1 | 82 | 1 | 16 | 5 | 16 | 5 | 11 | 2 | 4 | 14 | 7 | 14 | 7 | 7 | 0 | 8 | 6 | 8 | 13 | 8 | 13 | 8 | 13 | 14 | 7 | 5 | 1 | 6 | 2 | 3 | 1 | 1 | 2 | 0 | 1 | 1 | 0 | 5 | 2 | 0 | 5.5000 | 2.5000 | 14 | 7 | 8 | 13 | 3.5000 | 2.5000 |
| 3 | 101 | 3 | 1 | 2 | 3 | 16 | 5 | 16 | 5 | 11 | 2 | 4 | 14 | 7 | 14 | 7 | 7 | 0 | 3 | 6 | 13 | 8 | 13 | 8 | 13 | 8 | 14 | 7 | 5 | 0 | 1 | 2 | 1 | 1 | 1 | 2 | 0 | 0 | 0 | 0 | 0 | 2 | 0 | 3 | 0 | 14 | 7 | 13 | 8 | 1 | 0 |
| 4 | 101 | 4 | 1 | 42 | 47 | 15 | 3 | 15 | 3 | 12 | 5 | 10 | 10 | 8 | 10 | 8 | 2 | 0 | 1 | 2 | 14 | 4 | 14 | 4 | 10 | 8 | 14 | 4 | 10 | 0 | 4 | 8 | 1 | 2 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 5 | 0 | 1 | 0 | 10 | 8 | 14 | 4 | 4 | 0 |
| 5 | 101 | 5 | 1 | 62 | 67 | 5 | 15 | 15 | 5 | 10 | 7 | 6 | 12 | 8 | 8 | 12 | 4 | 1 | 5 | 10 | 10 | 10 | 10 | 10 | 10 | 10 | 8 | 12 | 0 | 1 | -2 | 4 | 5 | 1 | 2 | 2 | 0 | 1 | 1 | 4 | 0 | 5 | 2 | 5 | 0 | 8 | 12 | 10 | 10 | 0 | 2 |
| 6 | 101 | 6 | 1 | 60 | 23 | 1 | 15 | 15 | 1 | 14 | 3 | 6 | 4 | 12 | 12 | 4 | 8 | 0 | 1 | 2 | 2 | 14 | 14 | 2 | 12 | 4 | 14 | 2 | 12 | 0 | 2 | 4 | 1 | 2 | 2 | 1 | 0 | 0 | 0 | 0 | 0 | 3 | 0 | 1 | 0 | 12 | 4 | 14 | 2 | 2 | 0 |
| 7 | 101 | 7 | 1 | 44 | 47 | 16 | 1 | 16 | 1 | 15 | 11 | 8 | 5 | 12 | 5 | 12 | 7 | 1 | 1 | 2 | 15 | 2 | 15 | 2 | 5 | 12 | 15 | 2 | 13 | 0 | 10 | 6 | 3 | 2 | 1 | 1 | 0 | 1 | 1 | 7 | 0 | 7.5000 | 3.5000 | 1 | 0 | 5 | 12 | 15 | 2 | 6.5000 | 3.5000 |
| 8 | 101 | 8 | 1 | 24 | 29 | 2 | 16 | 16 | 2 | 14 | 3 | 6 | 5 | 13 | 13 | 5 | 8 | 0 | 5 | 10 | 7 | 11 | 11 | 7 | 11 | 7 | 13 | 5 | 4 | 0 | 2 | 4 | 1 | 1 | 2 | 2 | 0 | 0 | 0 | 0 | 0 | 3 | 0 | 5 | 0 | 13 | 5 | 11 | 7 | 2 | 0 |
| 9 | 101 | 9 | 1 | 20 | 45 | 16 | 3 | 16 | 3 | 13 | 1 | 2 | 15 | 4 | 15 | 4 | 11 | 0 | 11 | 4 | 5 | 14 | 5 | 14 | 5 | 14 | 15 | 4 | 9 | 1 | 10 | 2 | 3 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 9 | 1 | 0 | 6.5000 | 4.5000 | 15 | 4 | 5 | 14 | 5.5000 | 4.5000 |
| 10 | 101 | 10 | 1 | 16 | 53 | 15 | 2 | 15 | 2 | 13 | 6 | 12 | 9 | 8 | 9 | 8 | 1 | 0 | 2 | 4 | 13 | 4 | 13 | 4 | 9 | 8 | 13 | 4 | 9 | 0 | 4 | 8 | 1 | 2 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 6 | 0 | 2 | 0 | 9 | 8 | 13 | 4 | 4 | 0 |
| ⋮ |
So we need participants' ID numbers, a0-b2, and their choices. In the data frame above those are columns 1 (subject), 6-7 (initial_A/initial_B), 13-14 (A_final_1/B_final_1), 21-22 (A_final_2/B_final_2), and 36 (HPP_key, the choice coded 1 or 2), so let's extract those. The authors also only analyzed the rank-reversal condition (perhaps this is why we couldn't reliably recover parameters above), which is "trail_type" 3 (the misspelling is the dataset's own column name).
trialData = trialData(trialData.trail_type == 3, [1, 6, 7, 13, 14, 21, 22, 36]);
trialData.Properties.VariableNames = {'SubjectID', 'a0', 'b0', 'a1', 'b1', 'a2', 'b2', 'Chose1'};
trialData.Chose1 = trialData.Chose1 - 1;
trialData.Prob1 = zeros(height(trialData), 1)
trialData = 3704×9 table
| | SubjectID | a0 | b0 | a1 | b1 | a2 | b2 | Chose1 | Prob1 |
|---|---|---|---|---|---|---|---|---|---|
| 1 | 101 | 16 | 5 | 14 | 7 | 8 | 13 | 1 | 0 |
| 2 | 101 | 16 | 1 | 5 | 12 | 15 | 2 | 0 | 0 |
| 3 | 101 | 16 | 3 | 15 | 4 | 5 | 14 | 0 | 0 |
| 4 | 101 | 5 | 16 | 11 | 10 | 7 | 14 | 0 | 0 |
| 5 | 101 | 15 | 2 | 8 | 9 | 13 | 4 | 0 | 0 |
| 6 | 101 | 5 | 16 | 7 | 14 | 12 | 9 | 1 | 0 |
| 7 | 101 | 1 | 15 | 12 | 4 | 2 | 14 | 0 | 0 |
| 8 | 101 | 2 | 15 | 10 | 7 | 4 | 13 | 0 | 0 |
| 9 | 101 | 15 | 3 | 8 | 10 | 14 | 4 | 0 | 0 |
| 10 | 101 | 3 | 15 | 10 | 8 | 5 | 13 | 0 | 0 |
| 11 | 101 | 15 | 1 | 6 | 10 | 14 | 2 | 0 | 0 |
| 12 | 101 | 3 | 15 | 11 | 7 | 4 | 14 | 1 | 0 |
| 13 | 101 | 15 | 3 | 11 | 7 | 8 | 10 | 0 | 0 |
| 14 | 101 | 14 | 2 | 13 | 3 | 4 | 12 | 0 | 0 |
| 15 | 101 | 15 | 2 | 7 | 10 | 14 | 3 | 0 | 0 |
| 16 | 101 | 3 | 16 | 12 | 7 | 4 | 15 | 0 | 0 |
| 17 | 101 | 15 | 2 | 11 | 6 | 7 | 10 | 0 | 0 |
| 18 | 101 | 2 | 16 | 5 | 13 | 11 | 7 | 0 | 0 |
| 19 | 101 | 2 | 16 | 4 | 14 | 12 | 6 | 1 | 0 |
| 20 | 101 | 1 | 16 | 6 | 11 | 9 | 8 | 1 | 0 |
| 21 | 101 | 3 | 16 | 5 | 14 | 12 | 7 | 1 | 0 |
| 22 | 101 | 16 | 2 | 13 | 5 | 6 | 12 | 0 | 0 |
| 23 | 101 | 15 | 3 | 14 | 4 | 5 | 13 | 1 | 0 |
| 24 | 101 | 3 | 16 | 11 | 8 | 5 | 14 | 0 | 0 |
| 25 | 101 | 3 | 15 | 4 | 14 | 12 | 6 | 1 | 0 |
| 26 | 101 | 3 | 16 | 6 | 13 | 11 | 8 | 1 | 0 |
| 27 | 101 | 16 | 3 | 9 | 10 | 14 | 5 | 0 | 0 |
| 28 | 101 | 15 | 2 | 12 | 5 | 6 | 11 | 1 | 0 |
| 29 | 101 | 1 | 15 | 9 | 7 | 4 | 12 | 0 | 0 |
| 30 | 101 | 15 | 1 | 7 | 9 | 13 | 3 | 1 | 0 |
| 31 | 101 | 2 | 16 | 13 | 5 | 3 | 15 | 1 | 0 |
| 32 | 101 | 2 | 15 | 3 | 14 | 12 | 5 | 0 | 0 |
| 33 | 101 | 16 | 5 | 15 | 6 | 7 | 14 | 0 | 0 |
| 34 | 101 | 15 | 3 | 12 | 6 | 7 | 11 | 0 | 0 |
| 35 | 101 | 16 | 2 | 7 | 11 | 15 | 3 | 1 | 0 |
| 36 | 101 | 5 | 16 | 6 | 15 | 13 | 8 | 0 | 0 |
| 37 | 101 | 15 | 1 | 14 | 2 | 3 | 13 | 0 | 0 |
| 38 | 101 | 13 | 2 | 7 | 8 | 12 | 3 | 0 | 0 |
| 39 | 101 | 4 | 16 | 7 | 13 | 11 | 9 | 1 | 0 |
| 40 | 101 | 2 | 16 | 11 | 7 | 4 | 14 | 0 | 0 |
| 41 | 101 | 16 | 2 | 11 | 7 | 8 | 10 | 1 | 0 |
| 42 | 101 | 1 | 15 | 11 | 5 | 2 | 14 | 0 | 0 |
| 43 | 101 | 16 | 3 | 8 | 11 | 15 | 4 | 1 | 0 |
| 44 | 101 | 3 | 16 | 10 | 9 | 6 | 13 | 0 | 0 |
| 45 | 101 | 2 | 14 | 9 | 7 | 4 | 12 | 0 | 0 |
| 46 | 101 | 2 | 14 | 10 | 6 | 3 | 13 | 1 | 0 |
| 47 | 101 | 16 | 2 | 15 | 3 | 4 | 14 | 1 | 0 |
| 48 | 101 | 3 | 16 | 4 | 15 | 13 | 6 | 1 | 0 |
| 49 | 101 | 5 | 16 | 12 | 9 | 6 | 15 | 1 | 0 |
| 50 | 101 | 3 | 15 | 5 | 13 | 11 | 7 | 0 | 0 |
| 51 | 101 | 16 | 5 | 10 | 11 | 15 | 6 | 1 | 0 |
| 52 | 101 | 16 | 1 | 7 | 10 | 14 | 3 | 1 | 0 |
| 53 | 101 | 2 | 15 | 4 | 13 | 11 | 6 | 0 | 0 |
| 54 | 101 | 1 | 16 | 13 | 4 | 2 | 15 | 0 | 0 |
| 55 | 101 | 16 | 2 | 8 | 10 | 14 | 4 | 1 | 0 |
| 56 | 101 | 2 | 16 | 12 | 6 | 3 | 15 | 1 | 0 |
| 57 | 101 | 15 | 3 | 13 | 5 | 6 | 12 | 0 | 0 |
| 58 | 101 | 2 | 16 | 6 | 12 | 10 | 8 | 1 | 0 |
| 59 | 101 | 16 | 2 | 12 | 6 | 7 | 11 | 1 | 0 |
| 60 | 101 | 16 | 5 | 13 | 8 | 9 | 12 | 1 | 0 |
| 61 | 101 | 15 | 2 | 14 | 3 | 4 | 13 | 0 | 0 |
| 62 | 101 | 16 | 1 | 6 | 11 | 15 | 2 | 1 | 0 |
| 63 | 101 | 14 | 2 | 7 | 9 | 13 | 3 | 0 | 0 |
| 64 | 101 | 1 | 15 | 3 | 13 | 11 | 5 | 1 | 0 |
| 65 | 101 | 15 | 2 | 13 | 4 | 5 | 12 | 0 | 0 |
| 66 | 101 | 2 | 15 | 11 | 6 | 3 | 14 | 1 | 0 |
| 67 | 102 | 16 | 5 | 14 | 7 | 8 | 13 | 0 | 0 |
| 68 | 102 | 16 | 1 | 5 | 12 | 15 | 2 | 0 | 0 |
| 69 | 102 | 16 | 3 | 15 | 4 | 5 | 14 | 1 | 0 |
| 70 | 102 | 5 | 16 | 11 | 10 | 7 | 14 | 0 | 0 |
| 71 | 102 | 15 | 2 | 8 | 9 | 13 | 4 | 0 | 0 |
| 72 | 102 | 5 | 16 | 7 | 14 | 12 | 9 | 0 | 0 |
| 73 | 102 | 1 | 15 | 12 | 4 | 2 | 14 | 1 | 0 |
| 74 | 102 | 2 | 15 | 10 | 7 | 4 | 13 | 0 | 0 |
| 75 | 102 | 15 | 3 | 8 | 10 | 14 | 4 | 1 | 0 |
| 76 | 102 | 3 | 15 | 10 | 8 | 5 | 13 | 1 | 0 |
| 77 | 102 | 15 | 1 | 6 | 10 | 14 | 2 | 1 | 0 |
| 78 | 102 | 3 | 15 | 11 | 7 | 4 | 14 | 1 | 0 |
| 79 | 102 | 15 | 3 | 11 | 7 | 8 | 10 | 0 | 0 |
| 80 | 102 | 14 | 2 | 13 | 3 | 4 | 12 | 0 | 0 |
| 81 | 102 | 15 | 2 | 7 | 10 | 14 | 3 | 1 | 0 |
| 82 | 102 | 3 | 16 | 12 | 7 | 4 | 15 | 1 | 0 |
| 83 | 102 | 15 | 2 | 11 | 6 | 7 | 10 | 0 | 0 |
| 84 | 102 | 2 | 16 | 5 | 13 | 11 | 7 | 0 | 0 |
| 85 | 102 | 2 | 16 | 4 | 14 | 12 | 6 | 0 | 0 |
| 86 | 102 | 1 | 16 | 6 | 11 | 9 | 8 | 0 | 0 |
| 87 | 102 | 3 | 16 | 5 | 14 | 12 | 7 | 1 | 0 |
| 88 | 102 | 16 | 2 | 13 | 5 | 6 | 12 | 0 | 0 |
| 89 | 102 | 15 | 3 | 14 | 4 | 5 | 13 | 0 | 0 |
| 90 | 102 | 3 | 16 | 11 | 8 | 5 | 14 | 1 | 0 |
| 91 | 102 | 3 | 15 | 4 | 14 | 12 | 6 | 0 | 0 |
| 92 | 102 | 3 | 16 | 6 | 13 | 11 | 8 | 0 | 0 |
| 93 | 102 | 16 | 3 | 9 | 10 | 14 | 5 | 1 | 0 |
| 94 | 102 | 15 | 2 | 12 | 5 | 6 | 11 | 0 | 0 |
| 95 | 102 | 1 | 15 | 9 | 7 | 4 | 12 | 1 | 0 |
| 96 | 102 | 15 | 1 | 7 | 9 | 13 | 3 | 1 | 0 |
| 97 | 102 | 2 | 16 | 13 | 5 | 3 | 15 | 1 | 0 |
| 98 | 102 | 2 | 15 | 3 | 14 | 12 | 5 | 0 | 0 |
| 99 | 102 | 16 | 5 | 15 | 6 | 7 | 14 | 0 | 0 |
| 100 | 102 | 15 | 3 | 12 | 6 | 7 | 11 | 0 | 0 |
| ⋮ |
Now we can define some necessary variables and functions
included_subjects = unique(trialData.SubjectID);
grab_data = @(subject) trialData(trialData.SubjectID == subject, 2:end);
function updated = addPredictions(trialData, subject, predictions)
% Tables are passed by value, so write into the local copy and return it
trialData.Prob1(trialData.SubjectID == subject) = predictions;
updated = trialData;
end
Which allows us to recover the free parameters and define predicted decisions for each subject.
subjectData = table('Size', [length(included_subjects), 9], 'VariableTypes', repmat("double", 1, 9));
for i = 1:length(included_subjects)
df = grab_data(included_subjects(i));
result = optimize(@obj_function, initial_params, lower_bounds, upper_bounds, df);
df.Prob1 = generatePredictions(result, df);
model_SS = sum((df.Chose1 - df.Prob1).^2);
model_NLL = -2 * sum(df.Chose1 .* log(df.Prob1) + (1 - df.Chose1) .* log(1 - df.Prob1)); % deviance
subjectData(i, :) = {included_subjects(i), result(1), result(2), result(3), result(4), result(5), result(6), model_SS, model_NLL};
trialData = addPredictions(trialData, included_subjects(i), df.Prob1);
end
subjectData.Properties.VariableNames = {'subjectID', 'Alpha', 'Delta', 'Rho', 'Beta', 'Epsilon', 'Gamma', 'SS', 'Deviance'}
subjectData = 57×9 table
| | subjectID | Alpha | Delta | Rho | Beta | Epsilon | Gamma | SS | Deviance |
|---|---|---|---|---|---|---|---|---|---|
| 1 | 101 | 0.4136 | 1.4323 | 1.0577 | 4.2055 | 0.4394 | -0.0517 | 16.1212 | 89.9689 |
| 2 | 102 | 1.2781 | 0.0897 | 1.3041 | 5.7155 | 0.1256 | -0.3447 | 7.3522 | 48.5749 |
| 3 | 103 | 1.5895 | 0.1032 | 0.6247 | 7.0136 | 0.0157 | -0.4981 | 0.9688 | 8.9042 |
| 4 | 104 | 1.6709 | 0.0740 | 0.5221 | 7.7646 | 0.0152 | 0.4980 | 0.9697 | 8.9665 |
| 5 | 105 | 0.5115 | 1.3943 | 1.2966 | 4.4985 | 0.0313 | -0.4990 | 1.8750 | 14.9667 |
| 6 | 106 | 1.6946 | 0.0768 | 0.5445 | 8.0568 | 0.1529 | -0.1935 | 8.2339 | 54.0180 |
| 7 | 107 | 0.4745 | 1.5179 | 1.1049 | 4.3673 | 0.2273 | -0.1000 | 11.4545 | 69.9661 |
| 8 | 108 | 1.9824 | 1.3507 | 8.0938e-07 | 9.9194 | 0.4165 | -0.1919 | 15.5103 | 87.4497 |
| 9 | 109 | 0.4310 | 1.4541 | 1.0530 | 4.4265 | 0.3437 | -0.2727 | 12.1875 | 71.9739 |
| 10 | 110 | 1.0132e-06 | 0.1929 | 2.5954e-06 | 4.1437 | 0.0577 | -0.5000 | 3.4388 | 24.0005 |
| 11 | 111 | 1.5858 | 0.1039 | 0.6135 | 6.9625 | 4.2555e-05 | -0.0062 | 1.4398e-07 | 0.0060 |
| 12 | 112 | 1.6356 | 1.6726 | 2.1653e-06 | 9.4404 | 0.2840 | -0.1296 | 13.9254 | 80.4270 |
| 13 | 113 | 1.5858 | 0.1039 | 0.6135 | 6.9625 | 4.2555e-05 | -0.0062 | 1.4398e-07 | 0.0060 |
| 14 | 114 | 1.0576 | 0.0358 | 1.4700 | 6.3709 | 0.1822 | -0.3316 | 9.6595 | 60.1556 |
| 15 | 115 | 1.5744 | 0.1018 | 0.6222 | 6.8610 | 0.1364 | -0.4998 | 6.5455 | 38.6772 |
| 16 | 116 | 1.5972 | 0.1026 | 0.6677 | 6.9838 | 0.0909 | -0.4997 | 4.9091 | 31.2975 |
| 17 | 117 | 1.4259 | 1.9437 | 1.9508 | 9.8762 | 0.2435 | -0.0600 | 12.0097 | 72.2680 |
| 18 | 118 | 1.6835 | 0.0795 | 0.4731 | 7.8719 | 0.1979 | -0.3421 | 9.2083 | 56.9726 |
| 19 | 119 | 1.5873 | 0.1038 | 0.6269 | 7.0431 | 0.0152 | -0.4980 | 0.9697 | 8.9667 |
| 20 | 120 | 1.7004 | 0.0756 | 0.5270 | 7.7486 | 0.3987 | -0.1081 | 15.1174 | 85.4669 |
| 21 | 121 | 0.7272 | 0.0667 | 0.1301 | 4.1187 | 0.3379 | -0.2399 | 12.9073 | 75.6418 |
| 22 | 122 | 1.5857 | 0.1039 | 0.6134 | 6.9632 | 4.2555e-05 | -0.0063 | 1.4398e-07 | 0.0060 |
| 23 | 124 | 0.6494 | 0.0921 | 0.1190 | 3.7590 | 0.2860 | 0.0226 | 13.8195 | 80.4151 |
| 24 | 125 | 1.3794 | 0.1255 | 0.0020 | 2.2476 | 0.0256 | 0.4988 | 1.8740 | 15.0043 |
| 25 | 126 | 1.1792 | 0.7890 | 0.9054 | 4.1753 | 0.5000 | -0.2500 | 12.0000 | 71.9789 |
| 26 | 127 | 0.6693 | 1.2087 | 1.0936 | 3.9910 | 0.5000 | -0.0758 | 16.1212 | 89.9744 |
| 27 | 128 | 1.5890 | 0.1050 | 0.6162 | 7.0437 | 0.0455 | -0.4993 | 2.7273 | 20.1103 |
| 28 | 129 | 0.9742 | 0.7686 | 2.0000 | 10.0000 | 0.2698 | -0.2837 | 13.4410 | 77.5883 |
| 29 | 130 | 1.5952 | 0.0995 | 0.5786 | 6.5340 | 0.0157 | -0.4981 | 0.9688 | 8.9041 |
| 30 | 131 | 1.6493 | 0.0740 | 0.4094 | 7.7647 | 0.0909 | -0.1667 | 5.3939 | 39.4657 |
| 31 | 132 | 0.4169 | 1.4241 | 1.0976 | 4.4155 | 0.1212 | -0.3750 | 6.4848 | 43.0681 |
| 32 | 133 | 0.7084 | 1.2033 | 4.5178e-06 | 0.2328 | 0.2110 | -0.3937 | 13.8225 | 80.5423 |
| 33 | 134 | 0.7260 | 1.2339 | 1.0553 | 4.0047 | 0.5000 | -0.0846 | 15.7846 | 88.2386 |
| 34 | 135 | 0.7442 | 0.5342 | 2.0000 | 10.0000 | 0.3065 | -0.2433 | 14.0144 | 80.4176 |
| 35 | 136 | 1.6665 | 0.0714 | 0.5478 | 7.7469 | 0.1061 | -0.3571 | 5.8788 | 40.2556 |
| 36 | 138 | 0.4429 | 1.4670 | 1.2319 | 4.0619 | 0.3571 | -0.2000 | 12.2143 | 70.6853 |
| 37 | 140 | 1.5016 | 1.5422 | 1.9997 | 9.9987 | 0.3678 | -0.0058 | 15.1377 | 85.6076 |
| 38 | 141 | 1.9988 | 1.7130 | 1.9997 | 9.9990 | 0.4157 | -0.1011 | 15.3337 | 85.6393 |
| 39 | 142 | 0.7724 | 1.1749 | 1.0531 | 4.0679 | 0.5000 | -0.0606 | 16.2576 | 90.5233 |
| 40 | 143 | 0.5852 | 1.3712 | 1.1157 | 4.0080 | 0.5000 | -0.1154 | 15.3846 | 86.6162 |
| 41 | 144 | 1.5852 | 0.1032 | 0.6237 | 6.9778 | 0.0162 | -0.4981 | 0.9677 | 8.8396 |
| 42 | 145 | 0.7559 | 1.1949 | 0.9763 | 4.1178 | 0.5000 | -0.1290 | 14.4677 | 81.7741 |
| 43 | 146 | 0.7759 | 1.1789 | 1.0428 | 4.0385 | 0.5000 | -0.1364 | 15.2727 | 86.5236 |
| 44 | 147 | 0.6260 | 1.2515 | 1.1870 | 4.5035 | 4.8527e-05 | 0.0293 | 1.6811e-07 | 0.0066 |
| 45 | 148 | 1.6086 | 0.1035 | 0.6450 | 6.8864 | 0.0909 | -0.4997 | 4.9091 | 31.2975 |
| 46 | 149 | 1.9997 | 0.3635 | 5.5472e-04 | 9.9986 | 2.8567e-05 | -0.3139 | 0.5013 | 2.8825 |
| 47 | 150 | 0.7585 | 0.5801 | 2.0000 | 9.9999 | 0.2887 | -0.3189 | 12.4394 | 72.0456 |
| 48 | 151 | 1.6112 | 0.0691 | 0.4543 | 8.1068 | 0.2214 | -0.1774 | 9.7286 | 59.6184 |
| 49 | 152 | 0.9136 | 5.8932e-07 | 1.5662 | 7.0897 | 0.1727 | -0.1140 | 10.3596 | 64.9554 |
| 50 | 153 | 0.5963 | 1.3784 | 1.0256 | 5.9577 | 0.4091 | 0.0556 | 15.8182 | 88.7364 |
| 51 | 154 | 1.6822 | 0.0810 | 0.4812 | 7.8066 | 0.0303 | -1.1914e-07 | 1.9394 | 17.9248 |
| 52 | 155 | 2.0000 | 0.3627 | 3.8484e-05 | 9.9999 | 0.0469 | 0.1775 | 3.3472 | 26.6615 |
| 53 | 156 | 0.6292 | 1.2547 | 1.0216e-05 | 1.6516 | 0.0310 | -0.3874 | 2.7664 | 21.6691 |
| 54 | 157 | 0.5212 | 2.1390e-06 | 1.2895 | 1.3913 | 0.1660 | -0.5000 | 13.2897 | 77.3106 |
| 55 | 158 | 1.5612 | 1.9993 | 1.9993 | 9.9982 | 0.4772 | -0.0664 | 16.2239 | 90.3865 |
| 56 | 160 | 0.3847 | 1.4131 | 1.0644 | 4.1958 | 0.1534 | 0.0926 | 8.4091 | 55.4065 |
| 57 | 161 | 0.4322 | 1.3700 | 0.8757 | 4.6777 | 0.4129 | -0.1972 | 14.0606 | 80.9767 |
2.2 Compute Model Fit Index
We will use BIC as the model fit index because we are modeling the probabilistic nature of the data generation process: BIC works directly from the likelihood while penalizing the number of free parameters.
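Concretely, BIC = -2 ln L̂ + k ln n, where k is the number of free parameters and n the number of observations. The Deviance column already stores -2 ln L̂, the full model has k = 6 free parameters, and the hard-coded log(65) assumes 65 rank-reversal trials per subject.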
subjectData.BIC = subjectData.Deviance + log(65) * 6;
2.3 Identify the Best Model
We need to define new objective functions for each model. Since each model uses the same utility function but holds some parameters constant at 0, we only need to change the number of parameter inputs and zero out the constants. We can also keep a list of index vectors so that we can reuse the same prediction code down the line.
function f = of_alphaOnly(params, df, optimMethod)
params_new = zeros(1, 6);
params_new([1, 4:6]) = params; % alpha, beta, epsilon, gamma; delta = rho = 0
Prob1 = generatePredictions(params_new, df);
Chose1 = table2array(df(:, 7));
if strcmp(optimMethod, 'OLS')
    f = sum((Chose1 - Prob1).^2);
elseif strcmp(optimMethod, 'MLE')
    f = -sum(Chose1 .* log(Prob1) + (1 - Chose1) .* log(1 - Prob1));
end
end
function f = of_deltaOnly(params, df, optimMethod)
params_new = zeros(1, 6);
params_new([2, 4:6]) = params; % delta, beta, epsilon, gamma; alpha = rho = 0
Prob1 = generatePredictions(params_new, df);
Chose1 = table2array(df(:, 7));
if strcmp(optimMethod, 'OLS')
    f = sum((Chose1 - Prob1).^2);
elseif strcmp(optimMethod, 'MLE')
    f = -sum(Chose1 .* log(Prob1) + (1 - Chose1) .* log(1 - Prob1));
end
end
function f = of_rhoOnly(params, df, optimMethod)
params_new = zeros(1, 6);
params_new(3:6) = params; % rho, beta, epsilon, gamma; alpha = delta = 0
Prob1 = generatePredictions(params_new, df);
Chose1 = table2array(df(:, 7));
if strcmp(optimMethod, 'OLS')
    f = sum((Chose1 - Prob1).^2);
elseif strcmp(optimMethod, 'MLE')
    f = -sum(Chose1 .* log(Prob1) + (1 - Chose1) .* log(1 - Prob1));
end
end
function f = of_ad(params, df, optimMethod)
params_new = zeros(1, 6);
params_new([1:2, 4:6]) = params; % alpha, delta, beta, epsilon, gamma; rho = 0
Prob1 = generatePredictions(params_new, df);
Chose1 = table2array(df(:, 7));
if strcmp(optimMethod, 'OLS')
    f = sum((Chose1 - Prob1).^2);
elseif strcmp(optimMethod, 'MLE')
    f = -sum(Chose1 .* log(Prob1) + (1 - Chose1) .* log(1 - Prob1));
end
end
function f = of_ar(params, df, optimMethod)
params_new = zeros(1, 6);
params_new([1, 3:6]) = params; % alpha, rho, beta, epsilon, gamma; delta = 0
Prob1 = generatePredictions(params_new, df);
Chose1 = table2array(df(:, 7));
if strcmp(optimMethod, 'OLS')
    f = sum((Chose1 - Prob1).^2);
elseif strcmp(optimMethod, 'MLE')
    f = -sum(Chose1 .* log(Prob1) + (1 - Chose1) .* log(1 - Prob1));
end
end
function f = of_dr(params, df, optimMethod)
params_new = zeros(1, 6);
params_new(2:6) = params; % delta, rho, beta, epsilon, gamma; alpha = 0
Prob1 = generatePredictions(params_new, df);
Chose1 = table2array(df(:, 7));
if strcmp(optimMethod, 'OLS')
    f = sum((Chose1 - Prob1).^2);
elseif strcmp(optimMethod, 'MLE')
    f = -sum(Chose1 .* log(Prob1) + (1 - Chose1) .* log(1 - Prob1));
end
end
function f = of_noEpsilon(params, df, optimMethod)
params_new = zeros(1, 6);
params_new(1:4) = params; % alpha, delta, rho, beta; epsilon = gamma = 0
Prob1 = generatePredictions(params_new, df);
Chose1 = table2array(df(:, 7));
if strcmp(optimMethod, 'OLS')
    f = sum((Chose1 - Prob1).^2);
elseif strcmp(optimMethod, 'MLE')
    f = -sum(Chose1 .* log(Prob1) + (1 - Chose1) .* log(1 - Prob1));
end
end
function f = of_noGamma(params, df, optimMethod)
params_new = zeros(1, 6);
params_new(1:5) = params; % alpha, delta, rho, beta, epsilon; gamma = 0
Prob1 = generatePredictions(params_new, df);
Chose1 = table2array(df(:, 7));
if strcmp(optimMethod, 'OLS')
    f = sum((Chose1 - Prob1).^2);
elseif strcmp(optimMethod, 'MLE')
    f = -sum(Chose1 .* log(Prob1) + (1 - Chose1) .* log(1 - Prob1));
end
end
function f = of_GammaOnly(params, df, optimMethod)
params_new = zeros(1, 6);
params_new(6) = params; % gamma only; all utility weights, beta, and epsilon are 0
Prob1 = generatePredictions(params_new, df);
Chose1 = table2array(df(:, 7));
if strcmp(optimMethod, 'OLS')
    f = sum((Chose1 - Prob1).^2);
elseif strcmp(optimMethod, 'MLE')
    f = -sum(Chose1 .* log(Prob1) + (1 - Chose1) .* log(1 - Prob1));
end
end
ofs = {@of_alphaOnly, @of_deltaOnly, @of_rhoOnly, @of_ad, @of_ar, @of_dr, @of_noEpsilon, @of_noGamma, @of_GammaOnly};
idxs = {[1, 4:6], [2, 4:6], 3:6, [1:2, 4:6], [1, 3:6], 2:6, 1:4, 1:5, 6};
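To make the index mapping concrete, here is how one of these vectors rebuilds the full six-slot parameter vector (illustrative values, not fitted ones):
full = zeros(1, 6); % [alpha, delta, rho, beta, epsilon, gamma]
full(idxs{1}) = [0.8, 4.2, 0.1, -0.2]; % model 1 fits alpha, beta, epsilon, gamma
disp(full) % -> 0.8 0 0 4.2 0.1 -0.2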
And now we can preallocate the predictions for each model and the new data frame
altSubjectData = table();
altTrialData = trialData;
altTrialData.Prob1 = []; % Remove the 9th column
altTrialData.alphaOnly_Prob1 = zeros(height(altTrialData), 1);
altTrialData.deltaOnly_Prob1 = zeros(height(altTrialData), 1);
altTrialData.rhoOnly_Prob1 = zeros(height(altTrialData), 1);
altTrialData.ad_Prob1 = zeros(height(altTrialData), 1);
altTrialData.ar_Prob1 = zeros(height(altTrialData), 1);
altTrialData.dr_Prob1 = zeros(height(altTrialData), 1);
altTrialData.noEpsilon_Prob1 = zeros(height(altTrialData), 1);
altTrialData.noGamma_Prob1 = zeros(height(altTrialData), 1);
altTrialData.gammaOnly_Prob1 = zeros(height(altTrialData), 1);
Now let's recover the free parameters and generate predictions for each alternative model.
for i = 1:length(included_subjects)
df = grab_data(included_subjects(i));
numParams = sum(cellfun(@length, idxs)) + 2 * 9; % all models' parameters, plus SS and deviance for each of the 9 models
outputs = zeros(1, numParams);
j = 0; % running offset into outputs
for k = 1:length(ofs)
of = ofs{k};
idx = idxs{k};
initials = initial_params(idx);
uppers = upper_bounds(idx);
lowers = lower_bounds(idx);
result = optimize(of, initials, lowers, uppers, df);
pars = zeros(1, 6);
pars(idx) = result; % embed the fitted subset in the full parameter vector
df.Prob1 = generatePredictions(pars, df);
model_SS = sum((df.Chose1 - df.Prob1).^2);
model_NLL = -2 * sum(df.Chose1 .* log(df.Prob1) + (1 - df.Chose1) .* log(1 - df.Prob1));
outputs((j+1):(j+2+length(result))) = [result, model_SS, model_NLL];
j = j + 2 + length(result);
altTrialData{altTrialData.SubjectID == included_subjects(i), 8 + k} = df.Prob1;
end
altSubjectData(i, 1:56) = array2table([included_subjects(i), outputs]);
end
altSubjectData.Properties.VariableNames = {'SubjectID', 'Alpha_M1', 'Beta_M1', 'Epsilon_M1', 'Gamma_M1', 'SS_M1', 'Deviance_M1', ...
'Delta_M2', 'Beta_M2', 'Epsilon_M2', 'Gamma_M2', 'SS_M2', 'Deviance_M2', ...
'Rho_M3', 'Beta_M3', 'Epsilon_M3', 'Gamma_M3', 'SS_M3', 'Deviance_M3', ...
'Alpha_M4', 'Delta_M4', 'Beta_M4', 'Epsilon_M4', 'Gamma_M4', 'SS_M4', 'Deviance_M4', ...
'Alpha_M5', 'Rho_M5', 'Beta_M5', 'Epsilon_M5', 'Gamma_M5', 'SS_M5', 'Deviance_M5', ...
'Delta_M6', 'Rho_M6', 'Beta_M6', 'Epsilon_M6', 'Gamma_M6', 'SS_M6', 'Deviance_M6', ...
'Alpha_M7', 'Delta_M7', 'Rho_M7', 'Beta_M7', 'SS_M7', 'Deviance_M7', ...
'Alpha_M8', 'Delta_M8', 'Rho_M8', 'Beta_M8', 'Epsilon_M8', 'SS_M8', 'Deviance_M8', ...
'Gamma_M9', 'SS_M9', 'Deviance_M9'};
altSubjectData(:, 2:end) = varfun(@double, altSubjectData(:, 2:end))
altSubjectData = 57×56 table
| | SubjectID | Alpha_M1 | Beta_M1 | Epsilon_M1 | Gamma_M1 | SS_M1 | Deviance_M1 | Delta_M2 | Beta_M2 | Epsilon_M2 | Gamma_M2 | SS_M2 | Deviance_M2 | Rho_M3 | Beta_M3 | Epsilon_M3 | Gamma_M3 | SS_M3 | Deviance_M3 | Alpha_M4 | Delta_M4 | Beta_M4 | Epsilon_M4 | Gamma_M4 | SS_M4 | Deviance_M4 | Alpha_M5 | Rho_M5 | Beta_M5 | Epsilon_M5 | Gamma_M5 | SS_M5 | Deviance_M5 | Delta_M6 | Rho_M6 | Beta_M6 | Epsilon_M6 | Gamma_M6 | SS_M6 | Deviance_M6 | Alpha_M7 | Delta_M7 | Rho_M7 | Beta_M7 | SS_M7 | Deviance_M7 | Alpha_M8 | Delta_M8 | Rho_M8 | Beta_M8 | Epsilon_M8 | SS_M8 | Deviance_M8 | Gamma_M9 | SS_M9 | Deviance_M9 |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| 1 | 101 | 0.9996 | 4.0204 | 0.5000 | -0.0455 | 16.3636 | 90.9492 | 1.1755 | 4.0766 | 0.4394 | -0.0517 | 16.1212 | 89.9689 | 0.9239 | 3.9647 | 0.4362 | -0.0521 | 16.1212 | 89.9689 | 0.2345 | 1.4538 | 4.3429 | 0.4394 | -0.0517 | 16.1212 | 89.9689 | 0.9876 | 1.0059 | 4.0166 | 0.5000 | -0.0455 | 16.3636 | 90.9492 | 1.0000 | 1.0000 | 4.0184 | 0.4394 | -0.0517 | 16.1212 | 89.9689 | 5.2475e-07 | 2.4427e-07 | 1.3479 | 0.1807 | 16.2576 | 90.5233 | 0.4412 | 1.4597 | 1.0691 | 4.1116 | 0.4394 | 16.2576 | 90.5233 | -0.0455 | 16.5000 | 91.4954 |
| 2 | 102 | 1.6295 | 4.3651 | 0.1364 | -0.2778 | 7.3939 | 49.1954 | 1.0000 | 4.0381 | 0.5000 | -0.0758 | 16.1212 | 89.9744 | 0.1916 | 3.5848 | 0.5000 | -0.0758 | 16.1212 | 89.9744 | 1.1513 | 0.1651 | 3.1218 | 0.1256 | -0.3444 | 7.3523 | 48.5729 | 1.4784 | 0.4183 | 5.1971 | 0.1364 | -0.2778 | 7.3939 | 49.1954 | 1.0000 | 1.0000 | 4.0381 | 0.5000 | -0.0758 | 16.1212 | 89.9744 | 1.3839 | 0.1870 | 4.5093e-04 | 2.1135 | 8.8712 | 284.2021 | 1.6039 | 0.0858 | 0.3961 | 7.6261 | 0.1364 | 7.7727 | 52.5765 | -0.0758 | 16.5000 | 91.4954 |
| 3 | 103 | 1.3698 | 4.1702 | 0.0157 | -0.4979 | 0.9688 | 8.9047 | 0.9999 | 4.0269 | 0.5000 | -0.0231 | 16.2154 | 89.9706 | 0.1024 | 3.4158 | 0.5000 | -0.0231 | 16.2154 | 89.9706 | 1.4862 | 0.1328 | 6.6532 | 0.0157 | -0.4981 | 0.9688 | 8.9041 | 1.3381 | 0.8134 | 6.3171 | 0.0157 | -0.4978 | 0.9688 | 8.9046 | 1.0000 | 1.0000 | 4.0269 | 0.5000 | -0.0231 | 16.2154 | 89.9706 | 1.5480 | 0.0728 | 0.4774 | 5.0098 | 1.0000 | 46.0531 | 1.6714 | 0.0792 | 0.5377 | 7.7496 | 0.0154 | 0.9846 | 10.3333 | -0.0231 | 16.2500 | 90.1091 |
| 4 | 104 | 0.5114 | 2.7783 | 3.2654e-06 | -0.1373 | 0.9447 | 7.8535 | 0.9999 | 4.0163 | 0.5000 | 0.0152 | 16.4849 | 91.4348 | 0.0237 | 1.4694 | 0.4974 | 0.0152 | 16.4877 | 91.4464 | 1.5914 | 0.0989 | 7.5094 | 0.0152 | 0.4980 | 0.9697 | 8.9665 | 0.8776 | 1.0586 | 4.0682 | 5.0665e-05 | 0.0521 | 0.9444 | 7.7305 | 1.0000 | 1.0000 | 4.0163 | 0.5000 | 0.0152 | 16.4849 | 91.4348 | 1.1663 | 0.0063 | 0.9174 | 2.0908 | 0.9486 | 7.8216 | 1.6418 | 0.0691 | 0.5621 | 7.5030 | 0.0152 | 0.9848 | 10.3641 | 0.0152 | 16.5000 | 91.4954 |
| 5 | 105 | 0.9905 | 4.0225 | 0.5000 | -0.0312 | 15.9375 | 88.4727 | 0.1820 | 4.0823 | 0.0208 | -0.5000 | 1.8006 | 14.8022 | 1.5068 | 7.5555 | 0.0313 | -0.4991 | 1.8750 | 14.9671 | 4.0444e-06 | 1.0325 | 0.7197 | 0.0208 | -0.5000 | 1.8006 | 14.8022 | 0.0278 | 1.8204 | 9.0139 | 0.0313 | -0.4990 | 1.8750 | 14.9667 | 0.2037 | 3.2526e-06 | 3.6486 | 0.0208 | -0.5000 | 1.8006 | 14.8022 | 1.8653e-06 | 0.7461 | 1.0950e-05 | 0.8598 | 1.8414 | 18.8393 | 0.4795 | 1.3870 | 1.1075 | 4.4181 | 0.0313 | 1.9375 | 17.7998 | -0.0312 | 16 | 88.7228 |
| 6 | 106 | 1.3212 | 6.6075 | 0.1529 | -0.1935 | 8.2339 | 54.0180 | 1.0000 | 4.0260 | 0.5000 | -0.0538 | 16.0615 | 89.3538 | 8.4391e-04 | 3.3499 | 0.4994 | -0.0539 | 16.0616 | 89.3540 | 1.6109 | 0.0986 | 7.5899 | 0.1529 | -0.1935 | 8.2339 | 54.0180 | 1.6153 | 0.6099 | 8.0110 | 0.1529 | -0.1935 | 8.2339 | 54.0180 | 1.0000 | 1.0000 | 4.0260 | 0.5000 | -0.0538 | 16.0615 | 89.3538 | 1.6502 | 0.0855 | 0.5905 | 4.7771 | 10.0000 | 473.7453 | 1.7022 | 0.0740 | 0.5134 | 8.0040 | 0.1538 | 8.4615 | 55.8120 | -0.0538 | 16.2500 | 90.1091 |
| 7 | 107 | 0.9928 | 4.0249 | 0.5000 | -0.0455 | 16.3636 | 90.9492 | 1.3787 | 4.2427 | 0.2273 | -0.1000 | 11.4545 | 69.9661 | 1.0175 | 4.0378 | 0.2182 | -0.1042 | 11.4545 | 69.9661 | 0.2827 | 1.6190 | 4.0534 | 0.2273 | -0.1000 | 11.4545 | 69.9661 | 0.0171 | 1.7208 | 8.2653 | 0.2273 | -0.1000 | 11.4545 | 69.9661 | 1.1538 | 1.0513 | 4.1527 | 0.2273 | -0.1000 | 11.4545 | 69.9661 | 5.0894e-08 | 3.1019e-07 | 0.3182 | 3.8459 | 11.5909 | 70.7467 | 0.3709 | 1.3523 | 1.2430 | 4.2690 | 0.2273 | 11.5909 | 70.7467 | -0.0455 | 16.5000 | 91.4954 |
| 8 | 108 | 1.0011 | 4.0623 | 0.5000 | -0.1061 | 15.7576 | 88.5031 | 0.0117 | 3.6371 | 0.1054 | -0.4997 | 15.5243 | 87.5555 | 0.9529 | 4.7601 | 0.4690 | -0.1131 | 15.6970 | 88.2490 | 0.7953 | 0.6573 | 1.3950 | 0.3935 | -0.2040 | 15.4957 | 87.4619 | 1.0139 | 0.9928 | 4.0136 | 0.5000 | -0.1061 | 15.7576 | 88.5031 | 1.0000 | 1.0000 | 4.0247 | 0.4697 | -0.1129 | 15.6970 | 88.2490 | 2.0441e-06 | 1.3345 | 1.0884e-05 | 0.0257 | 16.2573 | 90.5192 | 1.4188 | 0.2956 | 0.8211 | 4.1554 | 0.5000 | 16.5000 | 91.4954 | -0.1061 | 16.5000 | 91.4954 |
| 9 | 109 | 0.9849 | 4.0662 | 0.5000 | -0.1875 | 13.7500 | 79.4991 | 1.1672 | 4.0760 | 0.3437 | -0.2727 | 12.1875 | 71.9739 | 0.9957 | 4.9879 | 0.3416 | -0.2745 | 12.1875 | 71.9739 | 0.3395 | 1.6147 | 4.4653 | 0.3437 | -0.2727 | 12.1875 | 71.9739 | 0.2387 | 1.9999 | 9.9996 | 0.3209 | -0.3277 | 12.1053 | 71.3591 | 1.0000 | 1.0000 | 4.0047 | 0.3437 | -0.2727 | 12.1875 | 71.9739 | 6.3150e-09 | 0.0095 | 0.0575 | 5.2776 | 14.4047 | 82.2280 | 0.2892 | 1.3199 | 1.0511 | 4.1861 | 0.3437 | 14.4375 | 82.3669 | -0.1875 | 16 | 88.7228 |
| 10 | 110 | 0.9897 | 4.0158 | 0.5000 | -0.0645 | 15.2419 | 84.9151 | 0.2642 | 3.0256 | 0.0577 | -0.5000 | 3.4388 | 24.0005 | 1.4975 | 7.4873 | 0.0645 | -0.4995 | 3.4839 | 23.8460 | 5.4358e-06 | 1.0350 | 0.7723 | 0.0577 | -0.5000 | 3.4388 | 24.0005 | 0.0294 | 1.8167 | 8.9877 | 0.0645 | -0.4995 | 3.4839 | 23.8457 | 1.0189 | 1.3708e-05 | 0.7845 | 0.0577 | -0.5000 | 3.4388 | 24.0005 | 4.1573e-07 | 0.5517 | 3.1257e-05 | 0.9153 | 3.7325 | 32.3045 | 0.5595 | 1.3154 | 1.0718 | 5.8812 | 0.0645 | 3.7419 | 29.6629 | -0.0645 | 15.5000 | 85.9503 |
| 11 | 111 | 1.1149 | 5.5743 | 5.4500e-05 | -0.0034 | 2.0413e-07 | 0.0073 | 0.9999 | 3.9821 | 0.5000 | 1.4217e-08 | 16.5000 | 91.4954 | 0.0343 | 3.4461 | 0.4991 | -7.3164e-06 | 16.5037 | 91.5102 | 1.4907 | 0.1352 | 6.6739 | 4.2872e-05 | -0.0111 | 1.3984e-07 | 0.0060 | 1.3317 | 0.8195 | 6.2815 | 4.7994e-05 | -0.0087 | 1.6948e-07 | 0.0067 | 1.0000 | 1.0000 | 3.9900 | 0.5000 | 3.4442e-09 | 16.5000 | 91.4954 | 1.5810 | 0.0989 | 0.6268 | 6.4028 | 2.4791e-08 | 8.0190e-04 | 1.5837 | 0.1030 | 0.6234 | 6.9733 | 4.2490e-05 | 1.4420e-07 | 0.0060 | 0 | 16.5000 | 91.4954 |
| 12 | 112 | 0.9910 | 4.0148 | 0.5000 | 0.0231 | 16.2154 | 89.9706 | 0.0897 | 1.2421 | 0.0269 | 0.4998 | 14.1811 | 81.5138 | 0.9407 | 3.9962 | 0.3303 | 0.0387 | 14.5114 | 83.0113 | 1.6265 | 1.6633 | 9.4870 | 0.2840 | -0.1296 | 13.9254 | 80.4270 | 0.2530 | 1.9999 | 9.9996 | 0.2865 | -0.0453 | 14.0606 | 81.0658 | 0.2032 | 1.3894e-07 | 0.5482 | 0.0269 | 0.5000 | 14.1811 | 81.5138 | 0.7154 | 1.1587 | 0.1662 | 0.1540 | 13.9814 | 80.7336 | 1.2611 | 1.3343 | 4.5506e-06 | 6.4642 | 0.3197 | 14.1964 | 81.7087 | 0.0231 | 16.2500 | 90.1091 |
| 13 | 113 | 1.1149 | 5.5743 | 5.4500e-05 | -0.0034 | 2.0413e-07 | 0.0073 | 0.9999 | 3.9821 | 0.5000 | 1.4217e-08 | 16.5000 | 91.4954 | 0.0343 | 3.4461 | 0.4991 | -7.3164e-06 | 16.5037 | 91.5102 | 1.4907 | 0.1352 | 6.6739 | 4.2872e-05 | -0.0111 | 1.3984e-07 | 0.0060 | 1.3317 | 0.8195 | 6.2815 | 4.7994e-05 | -0.0087 | 1.6948e-07 | 0.0067 | 1.0000 | 1.0000 | 3.9900 | 0.5000 | 3.4442e-09 | 16.5000 | 91.4954 | 1.5810 | 0.0989 | 0.6268 | 6.4028 | 2.4791e-08 | 8.0190e-04 | 1.5837 | 0.1030 | 0.6234 | 6.9733 | 4.2490e-05 | 1.4420e-07 | 0.0060 | 0 | 16.5000 | 91.4954 |
| 13 | 113 | 1.1149 | 5.5743 | 5.4500e-05 | -0.0034 | 2.0413e-07 | 0.0073 | 0.9999 | 3.9821 | 0.5000 | 1.4217e-08 | 16.5000 | 91.4954 | 0.0343 | 3.4461 | 0.4991 | -7.3164e-06 | 16.5037 | 91.5102 | 1.4907 | 0.1352 | 6.6739 | 4.2872e-05 | -0.0111 | 1.3984e-07 | 0.0060 | 1.3317 | 0.8195 | 6.2815 | 4.7994e-05 | -0.0087 | 1.6948e-07 | 0.0067 | 1.0000 | 1.0000 | 3.9900 | 0.5000 | 3.4442e-09 | 16.5000 | 91.4954 | 1.5810 | 0.0989 | 0.6268 | 6.4028 | 2.4791e-08 | 8.0190e-04 | 1.5837 | 0.1030 | 0.6234 | 6.9733 | 4.2490e-05 | 1.4420e-07 | 0.0060 | 0 | 16.5000 | 91.4954 |
|---|
| 14 | 114 | 0.3343 | 3.8320 | 0.1835 | -0.3244 | 9.6812 | 60.3935 | 1.0000 | 4.0875 | 0.5000 | -0.1061 | 15.7576 | 88.5031 | 1.8101e-07 | 2.6161 | 0.2888 | -0.1836 | 15.7576 | 88.5031 | 1.0794 | 0.1215 | 1.9524 | 0.1824 | -0.3305 | 9.6603 | 60.1663 | 0.9123 | 0.5666 | 2.0255 | 0.1841 | -0.3225 | 9.6807 | 60.3904 | 1.0000 | 1.0000 | 4.0875 | 0.5000 | -0.1061 | 15.7576 | 88.5031 | 1.3377 | 5.1408e-07 | 3.8893e-06 | 0.2297 | 12.4504 | 79.7345 | 1.7054 | 0.0704 | 0.5766 | 7.6246 | 0.1970 | 10.4394 | 65.4948 | -0.1061 | 16.5000 | 91.4954 |
|---|
| 15 | 115 | 1.1171 | 5.5851 | 0.1364 | -0.4997 | 6.5455 | 38.6775 | 1.0000 | 4.1159 | 0.5000 | -0.1364 | 15.2727 | 86.5236 | 3.6313e-07 | 5.4907 | 0.4574 | -0.1491 | 15.2727 | 86.5236 | 1.4749 | 0.1344 | 6.6421 | 0.1364 | -0.4998 | 6.5455 | 38.6771 | 1.3010 | 0.6052 | 5.6658 | 0.1364 | -0.4998 | 6.5455 | 38.6776 | 1.0000 | 1.0000 | 4.1160 | 0.5000 | -0.1364 | 15.2727 | 86.5236 | 1.5882 | 0.0986 | 0.6450 | 6.5117 | 9.0000 | 414.4660 | 1.6860 | 0.0700 | 0.5464 | 8.0086 | 0.1364 | 7.7727 | 52.5765 | -0.1364 | 16.5000 | 91.4954 |
|---|
| 16 | 116 | 1.1226 | 5.6129 | 0.0909 | -0.4996 | 4.9091 | 31.2978 | 1.0000 | 4.0818 | 0.5000 | -0.0909 | 15.9545 | 89.3014 | 5.8531e-04 | 3.0748 | 0.4961 | -0.0916 | 15.9547 | 89.3022 | 1.4751 | 0.1327 | 6.6471 | 0.0909 | -0.4997 | 4.9091 | 31.2974 | 1.2990 | 0.6642 | 6.0169 | 0.0909 | -0.4997 | 4.9091 | 31.2974 | 1.0000 | 1.0000 | 4.0819 | 0.5000 | -0.0909 | 15.9545 | 89.3014 | 1.5460 | 0.0611 | 0.3607 | 4.4132 | 6.0000 | 276.3120 | 1.7056 | 0.0762 | 0.5378 | 7.9671 | 0.0909 | 5.4545 | 40.2120 | -0.0909 | 16.5000 | 91.4954 |
|---|
| 17 | 117 | 0.9932 | 4.0152 | 0.5000 | -0.0231 | 16.2154 | 89.9706 | 1.3671 | 4.2384 | 0.2457 | -0.0549 | 12.0142 | 72.2934 | 0.9825 | 4.0150 | 0.2357 | -0.0573 | 12.0142 | 72.2934 | 0.2464 | 1.3479 | 5.0595 | 0.2457 | -0.0549 | 12.0142 | 72.2934 | 0.2528 | 2.0000 | 10.0000 | 0.1623 | -0.3398 | 10.9276 | 65.6919 | 1.0933 | 1.0334 | 4.1026 | 0.2457 | -0.0549 | 12.0142 | 72.2934 | 4.5155e-07 | 4.2333e-07 | 1.3526 | 0.8274 | 12.0615 | 72.5491 | 0.4163 | 1.4602 | 1.0436 | 4.1510 | 0.2462 | 12.0615 | 72.5491 | -0.0231 | 16.2500 | 90.1091 |
|---|
| 18 | 118 | 1.6875 | 4.2575 | 0.1979 | -0.3421 | 9.2083 | 56.9726 | 1.0000 | 4.0972 | 0.5000 | -0.1308 | 15.1385 | 85.6109 | 1.1343e-07 | 3.1783 | 0.2174 | -0.3008 | 15.1385 | 85.6109 | 1.5912 | 0.0937 | 7.4290 | 0.1979 | -0.3421 | 9.2083 | 56.9726 | 1.4923 | 0.7512 | 6.9552 | 0.1979 | -0.3421 | 9.2083 | 56.9726 | 1.0000 | 1.0000 | 4.0972 | 0.5000 | -0.1308 | 15.1385 | 85.6109 | 1.3512 | 4.1967e-07 | 1.7084e-05 | 0.9702 | 12.9343 | 233.4483 | 1.6951 | 0.0689 | 0.4863 | 7.9353 | 0.2000 | 10.4000 | 65.0523 | -0.1308 | 16.2500 | 90.1091 |
|---|
| 19 | 119 | 1.3472 | 4.1692 | 0.0152 | -0.4977 | 0.9697 | 8.9674 | 0.9999 | 4.0163 | 0.5000 | -0.0152 | 16.4849 | 91.4348 | 0.0230 | 1.4873 | 0.4974 | -0.0152 | 16.4877 | 91.4464 | 1.4934 | 0.1351 | 6.6885 | 0.0152 | -0.4980 | 0.9697 | 8.9667 | 1.6010 | 0.6762 | 4.2036 | 0.0152 | -0.4979 | 0.9697 | 8.9674 | 1.0000 | 1.0000 | 4.0163 | 0.5000 | -0.0152 | 16.4849 | 91.4348 | 1.5666 | 0.0883 | 0.4864 | 5.5850 | 1.0000 | 46.0526 | 1.6605 | 0.0813 | 0.5087 | 7.7868 | 0.0152 | 0.9848 | 10.3641 | -0.0152 | 16.5000 | 91.4954 |
|---|
| 20 | 120 | 1.7779 | 4.2365 | 0.3987 | -0.1081 | 15.1174 | 85.4669 | 1.0000 | 4.0899 | 0.5000 | -0.0846 | 15.7846 | 88.2386 | 0.0417 | 3.5751 | 0.4998 | -0.0847 | 15.7848 | 88.2394 | 1.2502 | 0.7677 | 3.8669 | 0.5000 | -0.0846 | 15.7846 | 88.2386 | 1.7563 | 0.2568 | 4.8530 | 0.3987 | -0.1081 | 15.1174 | 85.4669 | 1.0000 | 1.0000 | 4.0899 | 0.5000 | -0.0846 | 15.7846 | 88.2386 | 1.0434 | 0.5297 | 0.5534 | 3.3489e-08 | 16.2500 | 90.1091 | 1.2389 | 0.6792 | 0.9815 | 3.7933 | 0.5000 | 16.2500 | 90.1091 | -0.0846 | 16.2500 | 90.1091 |
|---|
| 21 | 121 | 1.5760 | 4.2096 | 0.3409 | -0.2333 | 12.9091 | 75.6546 | 1.0000 | 4.0593 | 0.5000 | -0.1615 | 14.5538 | 83.2013 | 5.2627e-08 | 3.5598 | 0.2926 | -0.2760 | 14.5538 | 83.2013 | 1.1221 | 0.1133 | 2.3905 | 0.3376 | -0.2405 | 12.9073 | 75.6417 | 1.5335 | 0.8103 | 6.9186 | 0.3409 | -0.2333 | 12.9091 | 75.6546 | 1.0000 | 1.0000 | 4.0594 | 0.5000 | -0.1615 | 14.5538 | 83.2013 | 0.0278 | 8.1461e-09 | 6.3605e-08 | 3.1028 | 15.4781 | 87.0211 | 0.3418 | 1.6116 | 1.0474 | 4.0246 | 0.5000 | 16.2500 | 90.1091 | -0.1615 | 16.2500 | 90.1091 |
|---|
| 22 | 122 | 1.1149 | 5.5742 | 5.4500e-05 | -0.0034 | 2.0413e-07 | 0.0073 | 0.9999 | 3.9821 | 0.5000 | 1.3140e-08 | 16.5000 | 91.4954 | 0.0342 | 3.4454 | 0.4990 | -7.7564e-06 | 16.5037 | 91.5102 | 1.4906 | 0.1352 | 6.6738 | 4.2872e-05 | -0.0111 | 1.3984e-07 | 0.0060 | 1.3320 | 0.8196 | 6.2799 | 4.7989e-05 | -0.0084 | 1.6945e-07 | 0.0067 | 1.0000 | 1.0000 | 3.9891 | 0.5000 | 9.3726e-09 | 16.5000 | 91.4955 | 1.5810 | 0.0989 | 0.6268 | 6.4028 | 2.4791e-08 | 8.0190e-04 | 1.5847 | 0.1025 | 0.6306 | 6.9747 | 4.2481e-05 | 1.4425e-07 | 0.0060 | 0 | 16.5000 | 91.4954 |
|---|
| 23 | 124 | 1.2497 | 4.0880 | 0.3030 | 0.0500 | 13.8788 | 80.6828 | 1.0000 | 4.0340 | 0.5000 | 0.0303 | 16.4394 | 91.2529 | 9.8594e-04 | 3.4509 | 0.4991 | 0.0304 | 16.4394 | 91.2530 | 1.0722 | 0.1672 | 2.0452 | 0.2855 | 0.0222 | 13.8194 | 80.4151 | 1.4059 | 0.7961 | 4.0925 | 0.3030 | 0.0500 | 13.8788 | 80.6828 | 1.0000 | 1.0000 | 4.0341 | 0.5000 | 0.0303 | 16.4394 | 91.2529 | 1.3689 | 5.2492e-06 | 4.0752e-06 | 0.1207 | 13.9532 | 80.9403 | 1.3481 | 0.2199 | 2.5344e-04 | 1.4982 | 0.2815 | 13.8267 | 80.4517 | 0.0303 | 16.5000 | 91.4954 |
|---|
| 24 | 125 | 1.6626 | 4.4897 | 0.0303 | 0.4990 | 1.8788 | 15.0938 | 0.9999 | 4.0267 | 0.5000 | 0.0303 | 16.4394 | 91.2529 | 0.1019 | 3.4158 | 0.5000 | 0.0303 | 16.4394 | 91.2529 | 1.1292 | 0.1028 | 2.7437 | 0.0256 | 0.4988 | 1.8740 | 15.0043 | 1.4756 | 0.7350 | 7.0459 | 0.0303 | 0.4989 | 1.8788 | 15.0943 | 1.0000 | 1.0000 | 4.0267 | 0.5000 | 0.0303 | 16.4394 | 91.2529 | 0.6230 | 0.0512 | 0.1259 | 3.7366 | 1.9210 | 21.4511 | 1.3595 | 0.1236 | 1.5615e-04 | 1.6898 | 0.0178 | 1.9047 | 17.0056 | 0.0303 | 16.5000 | 91.4954 |
|---|
| 25 | 126 | 0.3272 | 3.8701 | 0.3935 | -0.3193 | 11.3352 | 68.3571 | 1.0000 | 4.0769 | 0.5000 | -0.2500 | 12.0000 | 71.9789 | 0.0990 | 2.8793 | 0.4680 | -0.2673 | 12.0601 | 72.2997 | 1.1186 | 0.8748 | 4.1186 | 0.5000 | -0.2500 | 12.0000 | 71.9789 | 0.8690 | 1.5185 | 7.8874 | 0.3892 | -0.3287 | 11.3301 | 68.3247 | 1.0000 | 1.0000 | 4.0769 | 0.5000 | -0.2500 | 12.0000 | 71.9789 | 1.3511 | 1.7080e-06 | 1.4988e-05 | 0.0068 | 15.9909 | 88.6865 | 0.4929 | 1.2089 | 0.8543 | 3.9881 | 0.5000 | 16.0000 | 88.7228 | -0.2500 | 16 | 88.7228 |
|---|
| 26 | 127 | 1.3199 | 6.5976 | 0.3788 | -0.1000 | 15.1515 | 85.9597 | 1.0000 | 4.1108 | 0.5000 | -0.0758 | 16.1212 | 89.9744 | 2.1886e-04 | 3.5074 | 0.4997 | -0.0758 | 16.1212 | 89.9744 | 1.6011 | 0.0922 | 7.5940 | 0.3788 | -0.1000 | 15.1515 | 85.9597 | 1.6127 | 0.4302 | 7.2717 | 0.3788 | -0.1000 | 15.1515 | 85.9597 | 1.0000 | 1.0000 | 4.1109 | 0.5000 | -0.0758 | 16.1212 | 89.9744 | 1.3410 | 4.7227e-07 | 4.3497e-06 | 0.0199 | 16.4206 | 91.1807 | 0.7917 | 1.0537 | 1.0063 | 4.0090 | 0.5000 | 16.5000 | 91.4954 | -0.0758 | 16.5000 | 91.4954 |
|---|
| 27 | 128 | 1.1105 | 5.4754 | 0.0455 | -0.4992 | 2.7273 | 20.1109 | 0.9999 | 4.0259 | 0.5000 | -0.0455 | 16.3636 | 90.9492 | 0.0059 | 3.2632 | 0.4997 | -0.0455 | 16.3638 | 90.9500 | 1.5374 | 0.1434 | 6.6391 | 0.0455 | -0.4993 | 2.7273 | 20.1102 | 1.3731 | 0.5326 | 5.0621 | 0.0455 | -0.4993 | 2.7273 | 20.1108 | 1.0000 | 1.0000 | 4.0259 | 0.5000 | -0.0455 | 16.3636 | 90.9492 | 1.6076 | 0.0710 | 0.5459 | 4.9157 | 3.0000 | 138.1563 | 1.6839 | 0.0764 | 0.5002 | 7.8819 | 0.0455 | 2.8636 | 24.4078 | -0.0455 | 16.5000 | 91.4954 |
|---|
| 28 | 129 | 1.0015 | 4.0834 | 0.5000 | -0.0538 | 16.0615 | 89.3538 | 1.2775 | 4.1303 | 0.3546 | -0.0728 | 14.6884 | 83.7148 | 0.9474 | 4.0054 | 0.3480 | -0.0742 | 14.6884 | 83.7148 | 1.8421 | 2.0000 | 10.0000 | 0.3068 | -0.2282 | 14.1189 | 80.8510 | 0.9305 | 1.0349 | 4.0088 | 0.5000 | -0.0538 | 16.0615 | 89.3538 | 1.0000 | 1.0000 | 4.0383 | 0.3546 | -0.0728 | 14.6884 | 83.7148 | 0.1214 | 0.3734 | 1.1635 | 0.1777 | 15.0654 | 85.3551 | 0.4476 | 0.2130 | 2.0000 | 9.9998 | 0.3165 | 14.2687 | 82.0730 | -0.0538 | 16.2500 | 90.1091 |
|---|
| 29 | 130 | 1.3685 | 4.1780 | 0.0157 | -0.4978 | 0.9688 | 8.9046 | 0.9999 | 4.0163 | 0.5000 | -0.0156 | 15.9844 | 88.6603 | 0.0469 | 3.3388 | 0.5000 | -0.0156 | 15.9846 | 88.6611 | 1.4806 | 0.1398 | 6.9681 | 0.0157 | -0.4981 | 0.9688 | 8.9041 | 1.6125 | 0.6701 | 4.2098 | 0.0157 | -0.4979 | 0.9688 | 8.9048 | 1.0000 | 1.0000 | 4.0163 | 0.5000 | -0.0156 | 15.9844 | 88.6603 | 1.5779 | 0.1042 | 0.6468 | 6.8897 | 1.0000 | 46.0524 | 1.6636 | 0.0772 | 0.5247 | 7.6633 | 0.0156 | 0.9844 | 10.3021 | -0.0156 | 16 | 88.7228 |
|---|
| 30 | 131 | 1.7653 | 4.6747 | 0.0909 | -0.1667 | 5.3939 | 39.4657 | 0.9999 | 4.0276 | 0.5000 | -0.0303 | 16.4394 | 91.2529 | 0.0168 | 2.8024 | 0.4973 | -0.0307 | 16.4428 | 91.2664 | 1.5988 | 0.0942 | 7.5168 | 0.0909 | -0.1667 | 5.3939 | 39.4657 | 1.5293 | 0.6593 | 7.4129 | 0.0909 | -0.1667 | 5.3939 | 39.4657 | 1.0000 | 1.0000 | 4.0276 | 0.5000 | -0.0303 | 16.4394 | 91.2529 | 1.6006 | 0.0863 | 0.6660 | 5.9401 | 6.0000 | 285.5214 | 1.6630 | 0.0742 | 0.4817 | 8.0497 | 0.0909 | 5.4545 | 40.2120 | -0.0303 | 16.5000 | 91.4954 |
|---|
| 31 | 132 | 0.9965 | 4.0807 | 0.5000 | -0.0909 | 15.9545 | 89.3014 | 0.3469 | 3.9380 | 0.1201 | -0.3758 | 6.4822 | 43.0653 | 1.1649 | 5.5148 | 0.1200 | -0.3788 | 6.4848 | 43.0681 | 0.4803 | 1.4681 | 6.7319 | 0.1212 | -0.3750 | 6.4848 | 43.0681 | 0.2211 | 1.9994 | 9.9970 | 0.1061 | -0.5000 | 6.4438 | 41.3417 | 1.0000 | 1.0000 | 4.0995 | 0.1212 | -0.3750 | 6.4848 | 43.0681 | 2.5178e-08 | 1.1479e-07 | 0.4331 | 4.5742 | 7.0303 | 48.7520 | 0.5791 | 1.4915 | 0.9804 | 5.3504 | 0.1212 | 7.0303 | 48.7520 | -0.0909 | 16.5000 | 91.4954 |
|---|
| 32 | 133 | 0.9972 | 4.0112 | 0.5000 | -0.1364 | 15.2727 | 86.5236 | 0.0476 | 2.4669 | 0.1340 | -0.5000 | 13.9076 | 80.5246 | 0.8999 | 3.9821 | 0.3719 | -0.1834 | 14.3030 | 82.2720 | 0.4854 | 0.8246 | 0.3397 | 0.2110 | -0.3937 | 13.8225 | 80.5423 | 0.0158 | 1.7445 | 8.4715 | 0.3788 | -0.1800 | 14.3030 | 82.2721 | 0.0867 | 3.4871e-08 | 1.3538 | 0.1340 | -0.5000 | 13.9076 | 80.5246 | 7.4038e-07 | 1.3386 | 6.4444e-06 | 0.0634 | 15.0985 | 85.7632 | 4.6123e-06 | 1.3342 | 4.0143e-05 | 0.0637 | 8.3208e-05 | 15.0985 | 85.7632 | -0.1364 | 16.5000 | 91.4954 |
|---|
| 33 | 134 | 0.2784 | 3.7897 | 0.3565 | -0.1346 | 14.6201 | 83.3339 | 1.0000 | 3.9757 | 0.5000 | -0.0846 | 15.7846 | 88.2386 | 0.0013 | 3.4811 | 0.4952 | -0.0855 | 15.7848 | 88.2394 | 0.3047 | 0.0215 | 4.1617 | 0.3549 | -0.1377 | 14.6177 | 83.3200 | 0.8693 | 1.6731 | 8.8258 | 0.3344 | -0.1796 | 14.5390 | 82.9374 | 1.0000 | 1.0000 | 3.9757 | 0.5000 | -0.0846 | 15.7846 | 88.2386 | 1.3559 | 4.9393e-07 | 4.6904e-06 | 0.0343 | 16.0232 | 89.2267 | 0.7350 | 1.1587 | 0.9122 | 4.0384 | 0.5000 | 16.2500 | 90.1091 | -0.0846 | 16.2500 | 90.1091 |
|---|
| 34 | 135 | 1.0019 | 4.0812 | 0.5000 | -0.0692 | 15.9385 | 88.8590 | 1.2789 | 4.1199 | 0.3835 | -0.0926 | 15.0568 | 85.2241 | 0.9684 | 4.0389 | 0.3788 | -0.0938 | 15.0568 | 85.2241 | 1.5017 | 1.9960 | 9.9867 | 0.3549 | -0.1499 | 14.7627 | 83.8164 | 0.8694 | 1.0663 | 3.9969 | 0.5000 | -0.0692 | 15.9385 | 88.8590 | 1.1046 | 1.0352 | 4.0900 | 0.3835 | -0.0926 | 15.0568 | 85.2241 | 2.0210e-07 | 3.5272e-07 | 1.2842 | 0.3660 | 15.3846 | 86.6162 | 0.3310 | 0.1087 | 1.9998 | 9.9990 | 0.3629 | 15.2218 | 85.9427 | -0.0692 | 16.2500 | 90.1091 |
|---|
| 35 | 136 | 1.3006 | 6.6521 | 0.1061 | -0.3571 | 5.8788 | 40.2556 | 1.0000 | 4.0377 | 0.5000 | -0.0758 | 16.1212 | 89.9744 | 0.1901 | 3.4870 | 0.5000 | -0.0758 | 16.1212 | 89.9744 | 1.5823 | 0.0975 | 7.4361 | 0.1061 | -0.3571 | 5.8788 | 40.2556 | 1.5240 | 0.3111 | 6.0326 | 0.1061 | -0.3571 | 5.8788 | 40.2556 | 1.0000 | 1.0000 | 4.0377 | 0.5000 | -0.0758 | 16.1212 | 89.9744 | 1.5307 | 0.0804 | 0.3959 | 5.1966 | 7.0000 | 326.9682 | 1.7330 | 0.0577 | 0.6606 | 7.6153 | 0.1061 | 6.2576 | 44.6423 | -0.0758 | 16.5000 | 91.4954 |
|---|
| 36 | 138 | 0.9988 | 4.0106 | 0.5000 | -0.1379 | 13.3966 | 75.9335 | 1.3280 | 4.1072 | 0.3571 | -0.2000 | 12.2143 | 70.6853 | 0.9278 | 3.9878 | 0.3499 | -0.2041 | 12.2143 | 70.6853 | 0.3335 | 1.5839 | 4.5184 | 0.3571 | -0.2000 | 12.2143 | 70.6853 | 0.9086 | 1.0451 | 4.0016 | 0.5000 | -0.1379 | 13.3966 | 75.9335 | 1.0000 | 1.0000 | 4.0213 | 0.3571 | -0.2000 | 12.2143 | 70.6853 | 2.6759e-09 | 4.9825e-09 | 1.3486 | 0.4200 | 13.3966 | 75.9335 | 0.4939 | 1.5546 | 1.1010 | 4.0685 | 0.3621 | 13.3966 | 75.9335 | -0.1379 | 14.5000 | 80.4051 |
|---|
| 37 | 140 | 0.9985 | 4.0233 | 0.5000 | 0.0385 | 16.1538 | 89.7241 | 1.3793 | 4.1535 | 0.4006 | 0.0461 | 15.5114 | 87.1219 | 0.8853 | 3.9692 | 0.3945 | 0.0468 | 15.5114 | 87.1219 | 0.8087 | 1.3110 | 6.2211 | 0.3952 | 0.0400 | 15.4912 | 87.0408 | 0.6520 | 1.1702 | 3.9787 | 0.5000 | 0.0385 | 16.1538 | 89.7241 | 1.2526 | 1.0595 | 4.1428 | 0.4006 | 0.0461 | 15.5114 | 87.1219 | 0.0677 | 3.0698e-07 | 1.3422 | 0.3983 | 15.5871 | 87.4391 | 1.5033 | 1.5452 | 1.9997 | 9.9987 | 0.3692 | 15.1388 | 85.6121 | 0.0385 | 16.2500 | 90.1091 |
|---|
| 38 | 141 | 1.3041 | 6.5154 | 0.4768 | -0.0412 | 15.6169 | 86.8030 | 1.0000 | 4.0192 | 0.5000 | -0.0397 | 15.6508 | 86.9393 | 0.0124 | 3.5995 | 0.4888 | -0.0406 | 15.6515 | 86.9422 | 1.8446 | 1.9998 | 9.9991 | 0.4265 | -0.0862 | 15.3828 | 85.8458 | 1.4720 | 0.5949 | 6.9554 | 0.4768 | -0.0412 | 15.6169 | 86.8030 | 1.0000 | 1.0000 | 4.0192 | 0.5000 | -0.0397 | 15.6508 | 86.9393 | 0.8182 | 0.2485 | 0.8958 | 5.0677e-07 | 15.7500 | 87.3366 | 1.6831 | 0.0729 | 0.5111 | 7.4509 | 0.4762 | 15.7143 | 87.1936 | -0.0397 | 15.7500 | 87.3365 |
|---|
| 39 | 142 | 1.3028 | 6.5149 | 0.4242 | -0.0714 | 15.8788 | 88.9792 | 1.0000 | 4.0514 | 0.5000 | -0.0606 | 16.2576 | 90.5233 | 0.0448 | 1.7438 | 0.4684 | -0.0648 | 16.2700 | 90.5739 | 1.5743 | 0.0900 | 7.2495 | 0.4242 | -0.0714 | 15.8788 | 88.9792 | 1.4040 | 0.5811 | 7.4204 | 0.4242 | -0.0714 | 15.8788 | 88.9792 | 1.0000 | 1 | 4.0514 | 0.5000 | -0.0606 | 16.2576 | 90.5233 | 1.3305 | 1.0211e-06 | 1.1015e-05 | 0.0199 | 16.4211 | 91.1806 | 0.6911 | 1.0701 | 1.0675 | 4.0015 | 0.5000 | 16.5000 | 91.4954 | -0.0606 | 16.5000 | 91.4954 |
|---|
| 40 | 143 | 1.3048 | 6.5531 | 0.2633 | -0.2122 | 11.7424 | 70.4117 | 1.0000 | 4.0810 | 0.5000 | -0.1154 | 15.3846 | 86.6162 | 5.3342e-08 | 1.9997 | 0.2549 | -0.2263 | 15.3846 | 86.6162 | 1.5982 | 0.0940 | 7.6707 | 0.2633 | -0.2122 | 11.7424 | 70.4117 | 1.8968 | 0.5288 | 4.8941 | 0.2633 | -0.2122 | 11.7424 | 70.4117 | 1.0000 | 1.0000 | 4.0810 | 0.5000 | -0.1154 | 15.3846 | 86.6162 | 0.0450 | 6.3723e-09 | 4.8005e-08 | 3.3877 | 14.2298 | 82.1317 | 0.3662 | 1.5365 | 1.1405 | 4.0027 | 0.5000 | 16.2500 | 90.1092 | -0.1154 | 16.2500 | 90.1091 |
|---|
| 41 | 144 | 1.3653 | 4.1687 | 0.0162 | -0.4979 | 0.9677 | 8.8401 | 0.9999 | 4.0270 | 0.5000 | -0.0238 | 15.7143 | 87.1936 | 0.0159 | 2.7426 | 0.4974 | -0.0247 | 15.7178 | 87.2076 | 1.4962 | 0.1352 | 6.6320 | 0.0162 | -0.4981 | 0.9677 | 8.8395 | 1.6052 | 0.6789 | 4.2084 | 0.0162 | -0.4980 | 0.9677 | 8.8403 | 1.0000 | 1.0000 | 4.0270 | 0.5000 | -0.0238 | 15.7143 | 87.1936 | 1.5973 | 0.0804 | 0.5898 | 4.9827 | 1.0000 | 46.0537 | 1.6649 | 0.0806 | 0.5158 | 7.7245 | 0.0159 | 0.9841 | 10.2703 | -0.0238 | 15.7500 | 87.3365 |
|---|
| 42 | 145 | 0.1555 | 3.7859 | 0.4753 | -0.1394 | 14.4429 | 81.6657 | 1.0000 | 4.0303 | 0.5000 | -0.1290 | 14.4677 | 81.7741 | 0.7924 | 3.7215 | 0.4981 | -0.1293 | 14.4713 | 81.7894 | 1.8441 | 1.7813 | 9.5324 | 0.4296 | -0.1926 | 14.2482 | 80.7915 | 0.6963 | 1.9996 | 9.9982 | 0.4449 | -0.1789 | 14.3365 | 81.1959 | 1.0000 | 1.0000 | 4.0303 | 0.5000 | -0.1290 | 14.4677 | 81.7741 | 0.9383 | 0.0235 | 0.1726 | 0.0038 | 15.5173 | 86.0194 | 0.7910 | 1.2691 | 1.0430 | 4.2908 | 0.5000 | 15.5000 | 85.9503 | -0.1290 | 15.5000 | 85.9503 |
|---|
| 43 | 146 | 1.2639 | 6.2720 | 0.4697 | -0.1452 | 15.2121 | 86.2615 | 1.0000 | 4.0158 | 0.5000 | -0.1364 | 15.2727 | 86.5236 | 0.0062 | 3.5485 | 0.4965 | -0.1373 | 15.2729 | 86.5243 | 1.6259 | 1.6609 | 8.9648 | 0.4761 | -0.1547 | 15.2449 | 86.4050 | 1.4280 | 0.6511 | 6.9170 | 0.4697 | -0.1452 | 15.2121 | 86.2615 | 1.0000 | 1.0000 | 4.0158 | 0.5000 | -0.1364 | 15.2727 | 86.5236 | 0.7675 | 0.6007 | 0.9604 | 7.2121e-07 | 16.5000 | 91.4955 | 1.6550 | 0.0815 | 0.5141 | 7.8016 | 0.4697 | 16.4394 | 91.2529 | -0.1364 | 16.5000 | 91.4954 |
|---|
| 44 | 147 | 0.9862 | 4.0113 | 0.5000 | -0.0077 | 16.2462 | 90.0938 | 1.0133 | 4.9960 | 8.5216e-05 | 9.3067e-04 | 4.7218e-07 | 0.0111 | 1.5265 | 7.6346 | 4.5476e-05 | -0.0027 | 1.9064e-07 | 0.0070 | 0.5635 | 1.3352 | 5.8344 | 4.3832e-05 | -0.0021 | 1.3269e-07 | 0.0058 | 0.0521 | 1.6938 | 8.2276 | 3.9041e-05 | 0.0323 | 1.6941e-07 | 0.0065 | 1.0002 | 1.0001 | 4.9968 | 8.5243e-05 | 0.0014 | 4.7237e-07 | 0.0111 | 0.6402 | 1.2884 | 0.7174 | 4.8562 | 1.8534e-08 | 7.2232e-04 | 0.5931 | 1.2691 | 1.0670 | 4.3408 | 4.3729e-05 | 1.3274e-07 | 0.0059 | -0.0077 | 16.2500 | 90.1091 |
|---|
| 45 | 148 | 1.3455 | 4.2034 | 0.0909 | -0.4996 | 4.9091 | 31.2980 | 1.0000 | 4.0818 | 0.5000 | -0.0909 | 15.9545 | 89.3014 | 5.8531e-04 | 3.0748 | 0.4961 | -0.0916 | 15.9547 | 89.3022 | 1.5016 | 0.1372 | 6.6883 | 0.0909 | -0.4997 | 4.9091 | 31.2974 | 1.2943 | 0.6652 | 6.0490 | 0.0909 | -0.4997 | 4.9091 | 31.2974 | 1.0000 | 1.0000 | 4.0819 | 0.5000 | -0.0909 | 15.9545 | 89.3014 | 1.6205 | 0.0868 | 0.5270 | 5.2278 | 6.0000 | 276.3113 | 1.6954 | 0.0726 | 0.4920 | 7.9097 | 0.0909 | 5.4545 | 40.2120 | -0.0909 | 16.5000 | 91.4954 |
|---|
| 46 | 149 | 0.5159 | 2.7540 | 3.2517e-06 | -0.1337 | 0.9447 | 7.8535 | 0.9999 | 4.0163 | 0.5000 | 0.0152 | 16.4849 | 91.4348 | 0.0233 | 1.4839 | 0.4974 | 0.0152 | 16.4877 | 91.4464 | 1.9997 | 0.3635 | 9.9986 | 2.8582e-05 | -0.3141 | 0.5013 | 2.8825 | 0.8776 | 1.0586 | 4.0682 | 5.0613e-05 | 0.0521 | 0.9444 | 7.7305 | 1.0000 | 1.0000 | 4.0163 | 0.5000 | 0.0152 | 16.4849 | 91.4348 | 1.9997 | 0.3635 | 5.5497e-04 | 9.9986 | 0.5013 | 2.8788 | 1.9999 | 0.3635 | 1.1126e-04 | 9.9997 | 2.5650e-06 | 0.5013 | 2.8790 | 0.0152 | 16.5000 | 91.4954 |
|---|
| 47 | 150 | 0.9986 | 4.0157 | 0.5000 | -0.1129 | 14.7097 | 82.7615 | 1.3092 | 4.1562 | 0.3260 | -0.1645 | 12.8354 | 74.6745 | 0.9112 | 3.9951 | 0.3167 | -0.1694 | 12.8354 | 74.6745 | 0.5020 | 0.8557 | 3.9300 | 0.3174 | -0.1810 | 12.8071 | 74.4496 | 0.9056 | 1.0466 | 4.0011 | 0.5000 | -0.1129 | 14.7097 | 82.7615 | 1.0000 | 1.0000 | 4.0252 | 0.3260 | -0.1645 | 12.8354 | 74.6745 | 5.1432e-07 | 1.3422 | 1.4554e-05 | 0.0872 | 13.2438 | 76.5946 | 0.5844 | 1.2976 | 1.1168 | 6.1030 | 0.3226 | 13.5484 | 77.9711 | -0.1129 | 15.5000 | 85.9503 |
|---|
| 48 | 151 | 1.3419 | 6.6810 | 0.2214 | -0.1774 | 9.7286 | 59.6184 | 0.9999 | 4.0196 | 0.5000 | -0.0690 | 14.2241 | 79.2981 | 0.2081 | 3.5229 | 0.5000 | -0.0690 | 14.2241 | 79.2981 | 1.5787 | 0.0851 | 7.4717 | 0.2214 | -0.1774 | 9.7286 | 59.6184 | 1.5440 | 0.6480 | 7.2228 | 0.2214 | -0.1774 | 9.7286 | 59.6184 | 1.0000 | 1.0000 | 4.0197 | 0.5000 | -0.0690 | 14.2241 | 79.2981 | 0.3407 | 1.1267e-07 | 3.7153e-07 | 0.6340 | 11.3639 | 67.8236 | 1.6671 | 0.0724 | 0.4711 | 8.1409 | 0.2241 | 10.0862 | 61.7231 | -0.0690 | 14.5000 | 80.4051 |
|---|
| 49 | 152 | 0.2807 | 3.8646 | 0.1759 | -0.0982 | 10.3787 | 65.0776 | 0.9999 | 4.0167 | 0.5000 | -0.0152 | 16.4849 | 91.4348 | 0.0753 | 3.4603 | 0.5000 | -0.0151 | 16.4850 | 91.4355 | 1.5400 | 0.0889 | 7.3638 | 0.1970 | -0.0385 | 10.4242 | 65.3989 | 0.9052 | 1.5774 | 7.9416 | 0.1727 | -0.1140 | 10.3596 | 64.9554 | 1.0000 | 1.0000 | 4.0167 | 0.5000 | -0.0152 | 16.4849 | 91.4348 | 1.3498 | 8.4464e-07 | 6.7136e-06 | 0.2413 | 11.3573 | 72.9872 | 0.9559 | 2.6209e-07 | 1.5314 | 7.4475 | 0.1875 | 10.4236 | 65.3969 | -0.0152 | 16.5000 | 91.4954 |
|---|
| 50 | 153 | 0.9985 | 4.0227 | 0.5000 | 0.0455 | 16.3636 | 90.9492 | 1.0332 | 4.9878 | 0.4091 | 0.0556 | 15.8182 | 88.7364 | 0.9333 | 3.9785 | 0.4045 | 0.0562 | 15.8182 | 88.7364 | 0.4623 | 1.4366 | 6.3994 | 0.4091 | 0.0556 | 15.8182 | 88.7364 | 0.8066 | 1.1014 | 3.9926 | 0.5000 | 0.0455 | 16.3636 | 90.9492 | 0.9988 | 0.9991 | 5.0001 | 0.4091 | 0.0556 | 15.8182 | 88.7364 | 3.7629e-08 | 2.3571e-09 | 0.1064 | 3.4550 | 15.9545 | 89.3014 | 0.5263 | 1.5088 | 1.1391 | 4.3597 | 0.4091 | 15.9545 | 89.3014 | 0.0455 | 16.5000 | 91.4954 |
|---|
| 51 | 154 | 1.3200 | 6.6039 | 0.0303 | -2.5112e-07 | 1.9394 | 17.9248 | 0.9999 | 3.9848 | 0.5000 | 1.1207e-08 | 16.5000 | 91.4954 | 0.0450 | 3.4878 | 0.4992 | -2.8185e-05 | 16.5037 | 91.5102 | 1.5854 | 0.0954 | 7.4002 | 0.0303 | 1.1926e-07 | 1.9394 | 17.9248 | 1.5264 | 0.6867 | 7.3045 | 0.0303 | -2.5745e-06 | 1.9394 | 17.9248 | 1.0000 | 1.0000 | 3.9824 | 0.5000 | -8.7941e-09 | 16.5000 | 91.4954 | 1.5692 | 0.0882 | 0.4910 | 5.5834 | 2.0000 | 96.7094 | 1.6780 | 0.0773 | 0.5124 | 7.8343 | 0.0303 | 1.9394 | 17.9248 | 0 | 16.5000 | 91.4954 |
|---|
| 52 | 155 | 1.2974 | 6.4883 | 0.0606 | 0.2500 | 3.6970 | 29.0684 | 0.9999 | 4.0271 | 0.5000 | 0.0303 | 16.4394 | 91.2529 | 0.0151 | 2.5855 | 0.4971 | 0.0309 | 16.4427 | 91.2662 | 2.0000 | 0.3627 | 10.0000 | 0.0469 | 0.1775 | 3.3472 | 26.6615 | 1.5322 | 0.6975 | 7.3406 | 0.0606 | 0.2500 | 3.6970 | 29.0684 | 1.0000 | 1.0000 | 4.0271 | 0.5000 | 0.0303 | 16.4394 | 91.2529 | 2.0000 | 0.3635 | 2.7837e-05 | 9.9999 | 3.5013 | 150.2441 | 2.0000 | 0.3634 | 4.4299e-05 | 9.9999 | 0.0463 | 3.3648 | 27.0914 | 0.0303 | 16.5000 | 91.4954 |
|---|
| 53 | 156 | 0.9863 | 4.0115 | 0.5000 | -0.0152 | 16.4849 | 91.4348 | 1.3358 | 4.1876 | 0.0455 | -0.1667 | 2.8485 | 24.0521 | 1.2406 | 4.3673 | 0.0414 | -0.1830 | 2.8485 | 24.0521 | 0.4871 | 0.9713 | 2.1334 | 0.0310 | -0.3874 | 2.7664 | 21.6690 | 0.0179 | 1.7716 | 8.6653 | 0.0455 | -0.1667 | 2.8485 | 24.0521 | 1.0000 | 1.0000 | 4.0668 | 0.0455 | -0.1667 | 2.8485 | 24.0521 | 0.6021 | 1.2822 | 1.3695e-05 | 1.2703 | 2.8608 | 52.6651 | 0.6155 | 1.2715 | 1.0495e-05 | 1.6229 | 0.0337 | 2.8010 | 23.5251 | -0.0152 | 16.5000 | 91.4954 |
|---|
| 54 | 157 | 0.1554 | 1.4875 | 0.1110 | -0.5000 | 13.5727 | 78.7401 | 1.0000 | 4.0518 | 0.5000 | -0.0538 | 16.0615 | 89.3538 | 0.0145 | 3.2878 | 0.4932 | -0.0546 | 16.0652 | 89.3687 | 0.9427 | 0.2619 | 0.4332 | 0.1366 | -0.5000 | 13.3486 | 77.7072 | 0.3987 | 0.9865 | 1.8186 | 0.1660 | -0.5000 | 13.2897 | 77.3106 | 1.0000 | 1.0000 | 4.0518 | 0.5000 | -0.0538 | 16.0615 | 89.3538 | 1.3383 | 2.4066e-06 | 3.4647e-06 | 0.1049 | 14.4041 | 82.6456 | 2.0000 | 0.3613 | 7.6701e-06 | 10.0000 | 0.3177 | 14.1613 | 81.5627 | -0.0538 | 16.2500 | 90.1091 |
|---|
| 55 | 158 | 1.0038 | 4.0527 | 0.5000 | -0.0606 | 16.2576 | 90.5234 | 0.4201 | 3.8853 | 0.4848 | -0.0625 | 16.2424 | 90.4616 | 0.9399 | 4.7412 | 0.4845 | -0.0625 | 16.2424 | 90.4618 | 0.5233 | 1.4623 | 6.3015 | 0.4848 | -0.0625 | 16.2424 | 90.4618 | 1.1020 | 0.9388 | 4.0516 | 0.5000 | -0.0606 | 16.2576 | 90.5233 | 1.0000 | 1.0000 | 4.0499 | 0.4848 | -0.0625 | 16.2424 | 90.4618 | 0.7782 | 1.0071 | 1.2064 | 5.2343e-08 | 16.5000 | 91.4954 | 0.5155 | 1.3496 | 1.1118 | 4.1744 | 0.4848 | 16.4848 | 91.4348 | -0.0606 | 16.5000 | 91.4954 |
|---|
| 56 | 160 | 0.9890 | 4.0110 | 0.5000 | 0.0231 | 16.2154 | 89.9706 | 1.3348 | 4.1284 | 0.1534 | 0.0926 | 8.4091 | 55.4065 | 1.0774 | 4.0597 | 0.1446 | 0.0983 | 8.4091 | 55.4065 | 0.4524 | 1.4604 | 6.4163 | 0.1534 | 0.0926 | 8.4091 | 55.4065 | 0.2192 | 1.9999 | 9.9994 | 0.1413 | 0.0576 | 8.3843 | 55.2418 | 1.0000 | 1.0000 | 4.0358 | 0.1534 | 0.0926 | 8.4091 | 55.4065 | 0.0099 | 3.3245e-08 | 0.3853 | 5.0674 | 8.4505 | 55.7443 | 0.5088 | 1.5533 | 0.9981 | 4.5066 | 0.1538 | 8.4615 | 55.8120 | 0.0231 | 16.2500 | 90.1091 |
|---|
| 57 | 161 | 0.9933 | 4.0741 | 0.5000 | -0.1615 | 14.5538 | 83.2013 | 1.2048 | 4.1066 | 0.4129 | -0.1972 | 14.0606 | 80.9767 | 0.9237 | 3.9903 | 0.4084 | -0.1994 | 14.0606 | 80.9767 | 0.3490 | 1.5833 | 4.7439 | 0.4129 | -0.1972 | 14.0606 | 80.9767 | 0.0197 | 1.8093 | 8.9545 | 0.4129 | -0.1972 | 14.0606 | 80.9767 | 1.1381 | 0.9875 | 4.0685 | 0.4129 | -0.1972 | 14.0606 | 80.9767 | 6.9204e-08 | 1.6940e-07 | 1.3433 | 0.2544 | 15.7846 | 88.2386 | 0.4725 | 1.4255 | 1.0789 | 4.4543 | 0.4154 | 15.7846 | 88.2386 | -0.1615 | 16.2500 | 90.1091 |
|---|
Let's glance at the trial-level data for these alternative models.
altTrialData
altTrialData = 3704×17 table
| | SubjectID | a0 | b0 | a1 | b1 | a2 | b2 | Chose1 | alphaOnly_Prob1 | deltaOnly_Prob1 | rhoOnly_Prob1 | ad_Prob1 | ar_Prob1 | dr_Prob1 | noEpsilon_Prob1 | noGamma_Prob1 | gammaOnly_Prob1 |
|---|
| 1 | 101 | 16 | 5 | 14 | 7 | 8 | 13 | 1 | 0.4545 | 0.5152 | 0.5152 | 0.5152 | 0.4545 | 0.5152 | 0.5606 | 0.5606 | 0.5000 |
|---|
| 2 | 101 | 16 | 1 | 5 | 12 | 15 | 2 | 0 | 0.4545 | 0.3939 | 0.3939 | 0.3939 | 0.4545 | 0.3939 | 0.4394 | 0.4394 | 0.5000 |
|---|
| 3 | 101 | 16 | 3 | 15 | 4 | 5 | 14 | 0 | 0.4545 | 0.5152 | 0.5152 | 0.5152 | 0.4545 | 0.5152 | 0.5606 | 0.5606 | 0.5000 |
|---|
| 4 | 101 | 5 | 16 | 11 | 10 | 7 | 14 | 0 | 0.4545 | 0.3939 | 0.3939 | 0.3939 | 0.4545 | 0.3939 | 0.4394 | 0.4394 | 0.5000 |
|---|
| 5 | 101 | 15 | 2 | 8 | 9 | 13 | 4 | 0 | 0.4545 | 0.3939 | 0.3939 | 0.3939 | 0.4545 | 0.3939 | 0.4394 | 0.4394 | 0.5000 |
|---|
| 6 | 101 | 5 | 16 | 7 | 14 | 12 | 9 | 1 | 0.4545 | 0.5152 | 0.5152 | 0.5152 | 0.4545 | 0.5152 | 0.5606 | 0.5606 | 0.5000 |
|---|
| 7 | 101 | 1 | 15 | 12 | 4 | 2 | 14 | 0 | 0.4545 | 0.3939 | 0.3939 | 0.3939 | 0.4545 | 0.3939 | 0.4394 | 0.4394 | 0.5000 |
|---|
| 8 | 101 | 2 | 15 | 10 | 7 | 4 | 13 | 0 | 0.4545 | 0.3939 | 0.3939 | 0.3939 | 0.4545 | 0.3939 | 0.4394 | 0.4394 | 0.5000 |
|---|
| 9 | 101 | 15 | 3 | 8 | 10 | 14 | 4 | 0 | 0.4545 | 0.3939 | 0.3939 | 0.3939 | 0.4545 | 0.3939 | 0.4394 | 0.4394 | 0.5000 |
|---|
| 10 | 101 | 3 | 15 | 10 | 8 | 5 | 13 | 0 | 0.4545 | 0.3939 | 0.3939 | 0.3939 | 0.4545 | 0.3939 | 0.4394 | 0.4394 | 0.5000 |
|---|
| 11 | 101 | 15 | 1 | 6 | 10 | 14 | 2 | 0 | 0.4545 | 0.3939 | 0.3939 | 0.3939 | 0.4545 | 0.3939 | 0.4394 | 0.4394 | 0.5000 |
|---|
| 12 | 101 | 3 | 15 | 11 | 7 | 4 | 14 | 1 | 0.4545 | 0.3939 | 0.3939 | 0.3939 | 0.4545 | 0.3939 | 0.4394 | 0.4394 | 0.5000 |
|---|
| 13 | 101 | 15 | 3 | 11 | 7 | 8 | 10 | 0 | 0.4545 | 0.5152 | 0.5152 | 0.5152 | 0.4545 | 0.5152 | 0.5606 | 0.5606 | 0.5000 |
|---|
| 14 | 101 | 14 | 2 | 13 | 3 | 4 | 12 | 0 | 0.4545 | 0.5152 | 0.5152 | 0.5152 | 0.4545 | 0.5152 | 0.5606 | 0.5606 | 0.5000 |
|---|
| 15 | 101 | 15 | 2 | 7 | 10 | 14 | 3 | 0 | 0.4545 | 0.3939 | 0.3939 | 0.3939 | 0.4545 | 0.3939 | 0.4394 | 0.4394 | 0.5000 |
|---|
| 16 | 101 | 3 | 16 | 12 | 7 | 4 | 15 | 0 | 0.4545 | 0.3939 | 0.3939 | 0.3939 | 0.4545 | 0.3939 | 0.4394 | 0.4394 | 0.5000 |
|---|
| 17 | 101 | 15 | 2 | 11 | 6 | 7 | 10 | 0 | 0.4545 | 0.5152 | 0.5152 | 0.5152 | 0.4545 | 0.5152 | 0.5606 | 0.5606 | 0.5000 |
|---|
| 18 | 101 | 2 | 16 | 5 | 13 | 11 | 7 | 0 | 0.4545 | 0.5152 | 0.5152 | 0.5152 | 0.4545 | 0.5152 | 0.5606 | 0.5606 | 0.5000 |
|---|
| 19 | 101 | 2 | 16 | 4 | 14 | 12 | 6 | 1 | 0.4545 | 0.5152 | 0.5152 | 0.5152 | 0.4545 | 0.5152 | 0.5606 | 0.5606 | 0.5000 |
|---|
| 20 | 101 | 1 | 16 | 6 | 11 | 9 | 8 | 1 | 0.4545 | 0.5152 | 0.5152 | 0.5152 | 0.4545 | 0.5152 | 0.5606 | 0.5606 | 0.5000 |
|---|
| 21 | 101 | 3 | 16 | 5 | 14 | 12 | 7 | 1 | 0.4545 | 0.5152 | 0.5152 | 0.5152 | 0.4545 | 0.5152 | 0.5606 | 0.5606 | 0.5000 |
|---|
| 22 | 101 | 16 | 2 | 13 | 5 | 6 | 12 | 0 | 0.4545 | 0.5152 | 0.5152 | 0.5152 | 0.4545 | 0.5152 | 0.5606 | 0.5606 | 0.5000 |
|---|
| 23 | 101 | 15 | 3 | 14 | 4 | 5 | 13 | 1 | 0.4545 | 0.5152 | 0.5152 | 0.5152 | 0.4545 | 0.5152 | 0.5606 | 0.5606 | 0.5000 |
|---|
| 24 | 101 | 3 | 16 | 11 | 8 | 5 | 14 | 0 | 0.4545 | 0.3939 | 0.3939 | 0.3939 | 0.4545 | 0.3939 | 0.4394 | 0.4394 | 0.5000 |
|---|
| 25 | 101 | 3 | 15 | 4 | 14 | 12 | 6 | 1 | 0.4545 | 0.5152 | 0.5152 | 0.5152 | 0.4545 | 0.5152 | 0.5606 | 0.5606 | 0.5000 |
|---|
| 26 | 101 | 3 | 16 | 6 | 13 | 11 | 8 | 1 | 0.4545 | 0.5152 | 0.5152 | 0.5152 | 0.4545 | 0.5152 | 0.5606 | 0.5606 | 0.5000 |
|---|
| 27 | 101 | 16 | 3 | 9 | 10 | 14 | 5 | 0 | 0.4545 | 0.3939 | 0.3939 | 0.3939 | 0.4545 | 0.3939 | 0.4394 | 0.4394 | 0.5000 |
|---|
| 28 | 101 | 15 | 2 | 12 | 5 | 6 | 11 | 1 | 0.4545 | 0.5152 | 0.5152 | 0.5152 | 0.4545 | 0.5152 | 0.5606 | 0.5606 | 0.5000 |
|---|
| 29 | 101 | 1 | 15 | 9 | 7 | 4 | 12 | 0 | 0.4545 | 0.3939 | 0.3939 | 0.3939 | 0.4545 | 0.3939 | 0.4394 | 0.4394 | 0.5000 |
|---|
| 30 | 101 | 15 | 1 | 7 | 9 | 13 | 3 | 1 | 0.4545 | 0.3939 | 0.3939 | 0.3939 | 0.4545 | 0.3939 | 0.4394 | 0.4394 | 0.5000 |
|---|
| 31 | 101 | 2 | 16 | 13 | 5 | 3 | 15 | 1 | 0.4545 | 0.3939 | 0.3939 | 0.3939 | 0.4545 | 0.3939 | 0.4394 | 0.4394 | 0.5000 |
|---|
| 32 | 101 | 2 | 15 | 3 | 14 | 12 | 5 | 0 | 0.4545 | 0.5152 | 0.5152 | 0.5152 | 0.4545 | 0.5152 | 0.5606 | 0.5606 | 0.5000 |
|---|
| 33 | 101 | 16 | 5 | 15 | 6 | 7 | 14 | 0 | 0.4545 | 0.5152 | 0.5152 | 0.5152 | 0.4545 | 0.5152 | 0.5606 | 0.5606 | 0.5000 |
|---|
| 34 | 101 | 15 | 3 | 12 | 6 | 7 | 11 | 0 | 0.4545 | 0.5152 | 0.5152 | 0.5152 | 0.4545 | 0.5152 | 0.5606 | 0.5606 | 0.5000 |
|---|
| 35 | 101 | 16 | 2 | 7 | 11 | 15 | 3 | 1 | 0.4545 | 0.3939 | 0.3939 | 0.3939 | 0.4545 | 0.3939 | 0.4394 | 0.4394 | 0.5000 |
|---|
| 36 | 101 | 5 | 16 | 6 | 15 | 13 | 8 | 0 | 0.4545 | 0.5152 | 0.5152 | 0.5152 | 0.4545 | 0.5152 | 0.5606 | 0.5606 | 0.5000 |
|---|
| 37 | 101 | 15 | 1 | 14 | 2 | 3 | 13 | 0 | 0.4545 | 0.5152 | 0.5152 | 0.5152 | 0.4545 | 0.5152 | 0.5606 | 0.5606 | 0.5000 |
|---|
| 38 | 101 | 13 | 2 | 7 | 8 | 12 | 3 | 0 | 0.4545 | 0.3939 | 0.3939 | 0.3939 | 0.4545 | 0.3939 | 0.4394 | 0.4394 | 0.5000 |
|---|
| 39 | 101 | 4 | 16 | 7 | 13 | 11 | 9 | 1 | 0.4545 | 0.5152 | 0.5152 | 0.5152 | 0.4545 | 0.5152 | 0.5606 | 0.5606 | 0.5000 |
|---|
| 40 | 101 | 2 | 16 | 11 | 7 | 4 | 14 | 0 | 0.4545 | 0.3939 | 0.3939 | 0.3939 | 0.4545 | 0.3939 | 0.4394 | 0.4394 | 0.5000 |
|---|
| 41 | 101 | 16 | 2 | 11 | 7 | 8 | 10 | 1 | 0.4545 | 0.5152 | 0.5152 | 0.5152 | 0.4545 | 0.5152 | 0.5606 | 0.5606 | 0.5000 |
|---|
| 42 | 101 | 1 | 15 | 11 | 5 | 2 | 14 | 0 | 0.4545 | 0.3939 | 0.3939 | 0.3939 | 0.4545 | 0.3939 | 0.4394 | 0.4394 | 0.5000 |
|---|
| 43 | 101 | 16 | 3 | 8 | 11 | 15 | 4 | 1 | 0.4545 | 0.3939 | 0.3939 | 0.3939 | 0.4545 | 0.3939 | 0.4394 | 0.4394 | 0.5000 |
|---|
| 44 | 101 | 3 | 16 | 10 | 9 | 6 | 13 | 0 | 0.4545 | 0.3939 | 0.3939 | 0.3939 | 0.4545 | 0.3939 | 0.4394 | 0.4394 | 0.5000 |
|---|
| 45 | 101 | 2 | 14 | 9 | 7 | 4 | 12 | 0 | 0.4545 | 0.3939 | 0.3939 | 0.3939 | 0.4545 | 0.3939 | 0.4394 | 0.4394 | 0.5000 |
|---|
| 46 | 101 | 2 | 14 | 10 | 6 | 3 | 13 | 1 | 0.4545 | 0.3939 | 0.3939 | 0.3939 | 0.4545 | 0.3939 | 0.4394 | 0.4394 | 0.5000 |
|---|
| 47 | 101 | 16 | 2 | 15 | 3 | 4 | 14 | 1 | 0.4545 | 0.5152 | 0.5152 | 0.5152 | 0.4545 | 0.5152 | 0.5606 | 0.5606 | 0.5000 |
|---|
| 48 | 101 | 3 | 16 | 4 | 15 | 13 | 6 | 1 | 0.4545 | 0.5152 | 0.5152 | 0.5152 | 0.4545 | 0.5152 | 0.5606 | 0.5606 | 0.5000 |
|---|
| 49 | 101 | 5 | 16 | 12 | 9 | 6 | 15 | 1 | 0.4545 | 0.3939 | 0.3939 | 0.3939 | 0.4545 | 0.3939 | 0.4394 | 0.4394 | 0.5000 |
|---|
| 50 | 101 | 3 | 15 | 5 | 13 | 11 | 7 | 0 | 0.4545 | 0.5152 | 0.5152 | 0.5152 | 0.4545 | 0.5152 | 0.5606 | 0.5606 | 0.5000 |
|---|
| 51 | 101 | 16 | 5 | 10 | 11 | 15 | 6 | 1 | 0.4545 | 0.3939 | 0.3939 | 0.3939 | 0.4545 | 0.3939 | 0.4394 | 0.4394 | 0.5000 |
|---|
| 52 | 101 | 16 | 1 | 7 | 10 | 14 | 3 | 1 | 0.4545 | 0.3939 | 0.3939 | 0.3939 | 0.4545 | 0.3939 | 0.4394 | 0.4394 | 0.5000 |
|---|
| 53 | 101 | 2 | 15 | 4 | 13 | 11 | 6 | 0 | 0.4545 | 0.5152 | 0.5152 | 0.5152 | 0.4545 | 0.5152 | 0.5606 | 0.5606 | 0.5000 |
|---|
| 54 | 101 | 1 | 16 | 13 | 4 | 2 | 15 | 0 | 0.4545 | 0.3939 | 0.3939 | 0.3939 | 0.4545 | 0.3939 | 0.4394 | 0.4394 | 0.5000 |
|---|
| 55 | 101 | 16 | 2 | 8 | 10 | 14 | 4 | 1 | 0.4545 | 0.3939 | 0.3939 | 0.3939 | 0.4545 | 0.3939 | 0.4394 | 0.4394 | 0.5000 |
|---|
| 56 | 101 | 2 | 16 | 12 | 6 | 3 | 15 | 1 | 0.4545 | 0.3939 | 0.3939 | 0.3939 | 0.4545 | 0.3939 | 0.4394 | 0.4394 | 0.5000 |
|---|
| 57 | 101 | 15 | 3 | 13 | 5 | 6 | 12 | 0 | 0.4545 | 0.5152 | 0.5152 | 0.5152 | 0.4545 | 0.5152 | 0.5606 | 0.5606 | 0.5000 |
|---|
| 58 | 101 | 2 | 16 | 6 | 12 | 10 | 8 | 1 | 0.4545 | 0.5152 | 0.5152 | 0.5152 | 0.4545 | 0.5152 | 0.5606 | 0.5606 | 0.5000 |
|---|
| 59 | 101 | 16 | 2 | 12 | 6 | 7 | 11 | 1 | 0.4545 | 0.5152 | 0.5152 | 0.5152 | 0.4545 | 0.5152 | 0.5606 | 0.5606 | 0.5000 |
|---|
| 60 | 101 | 16 | 5 | 13 | 8 | 9 | 12 | 1 | 0.4545 | 0.5152 | 0.5152 | 0.5152 | 0.4545 | 0.5152 | 0.5606 | 0.5606 | 0.5000 |
|---|
| 61 | 101 | 15 | 2 | 14 | 3 | 4 | 13 | 0 | 0.4545 | 0.5152 | 0.5152 | 0.5152 | 0.4545 | 0.5152 | 0.5606 | 0.5606 | 0.5000 |
|---|
| 62 | 101 | 16 | 1 | 6 | 11 | 15 | 2 | 1 | 0.4545 | 0.3939 | 0.3939 | 0.3939 | 0.4545 | 0.3939 | 0.4394 | 0.4394 | 0.5000 |
|---|
| 63 | 101 | 14 | 2 | 7 | 9 | 13 | 3 | 0 | 0.4545 | 0.3939 | 0.3939 | 0.3939 | 0.4545 | 0.3939 | 0.4394 | 0.4394 | 0.5000 |
|---|
| 64 | 101 | 1 | 15 | 3 | 13 | 11 | 5 | 1 | 0.4545 | 0.5152 | 0.5152 | 0.5152 | 0.4545 | 0.5152 | 0.5606 | 0.5606 | 0.5000 |
|---|
| 65 | 101 | 15 | 2 | 13 | 4 | 5 | 12 | 0 | 0.4545 | 0.5152 | 0.5152 | 0.5152 | 0.4545 | 0.5152 | 0.5606 | 0.5606 | 0.5000 |
|---|
| 66 | 101 | 2 | 15 | 11 | 6 | 3 | 14 | 1 | 0.4545 | 0.3939 | 0.3939 | 0.3939 | 0.4545 | 0.3939 | 0.4394 | 0.4394 | 0.5000 |
|---|
| 67 | 102 | 16 | 5 | 14 | 7 | 8 | 13 | 0 | 0.0606 | 0.4242 | 0.4242 | 0.0514 | 0.0606 | 0.4242 | 0.0300 | 0.1364 | 0.5000 |
|---|
| 68 | 102 | 16 | 1 | 5 | 12 | 15 | 2 | 0 | 0.7879 | 0.4242 | 0.4242 | 0.7878 | 0.7879 | 0.4242 | 1.0000 | 0.8636 | 0.5000 |
|---|
| 69 | 102 | 16 | 3 | 15 | 4 | 5 | 14 | 1 | 0.0606 | 0.4242 | 0.4242 | 0.1258 | 0.0606 | 0.4242 | 0.1306 | 0.1364 | 0.5000 |
|---|
| 70 | 102 | 5 | 16 | 11 | 10 | 7 | 14 | 0 | 0.7879 | 0.4242 | 0.4242 | 0.7878 | 0.7879 | 0.4242 | 1.0000 | 0.8636 | 0.5000 |
|---|
| 71 | 102 | 15 | 2 | 8 | 9 | 13 | 4 | 0 | 0.7879 | 0.4242 | 0.4242 | 0.7878 | 0.7879 | 0.4242 | 1.0000 | 0.8636 | 0.5000 |
|---|
| 72 | 102 | 5 | 16 | 7 | 14 | 12 | 9 | 0 | 0.0606 | 0.4242 | 0.4242 | 0.0391 | 0.0606 | 0.4242 | 5.9968e-05 | 0.1364 | 0.5000 |
|---|
| 73 | 102 | 1 | 15 | 12 | 4 | 2 | 14 | 1 | 0.7879 | 0.4242 | 0.4242 | 0.7877 | 0.7879 | 0.4242 | 0.9996 | 0.8636 | 0.5000 |
|---|
| 74 | 102 | 2 | 15 | 10 | 7 | 4 | 13 | 0 | 0.7879 | 0.4242 | 0.4242 | 0.7878 | 0.7879 | 0.4242 | 1.0000 | 0.8636 | 0.5000 |
|---|
| 75 | 102 | 15 | 3 | 8 | 10 | 14 | 4 | 1 | 0.7879 | 0.4242 | 0.4242 | 0.7878 | 0.7879 | 0.4242 | 1.0000 | 0.8636 | 0.5000 |
|---|
| 76 | 102 | 3 | 15 | 10 | 8 | 5 | 13 | 1 | 0.7879 | 0.4242 | 0.4242 | 0.7878 | 0.7879 | 0.4242 | 1.0000 | 0.8636 | 0.5000 |
|---|
| 77 | 102 | 15 | 1 | 6 | 10 | 14 | 2 | 1 | 0.7879 | 0.4242 | 0.4242 | 0.7878 | 0.7879 | 0.4242 | 1.0000 | 0.8636 | 0.5000 |
|---|
| 78 | 102 | 3 | 15 | 11 | 7 | 4 | 14 | 1 | 0.7879 | 0.4242 | 0.4242 | 0.7878 | 0.7879 | 0.4242 | 1.0000 | 0.8636 | 0.5000 |
|---|
| 79 | 102 | 15 | 3 | 11 | 7 | 8 | 10 | 0 | 0.0606 | 0.4242 | 0.4242 | 0.0417 | 0.0606 | 0.4242 | 0.0094 | 0.1364 | 0.5000 |
|---|
| 80 | 102 | 14 | 2 | 13 | 3 | 4 | 12 | 0 | 0.0606 | 0.4242 | 0.4242 | 0.0934 | 0.0606 | 0.4242 | 0.0919 | 0.1364 | 0.5000 |
|---|
| 81 | 102 | 15 | 2 | 7 | 10 | 14 | 3 | 1 | 0.7879 | 0.4242 | 0.4242 | 0.7878 | 0.7879 | 0.4242 | 1.0000 | 0.8636 | 0.5000 |
|---|
| 82 | 102 | 3 | 16 | 12 | 7 | 4 | 15 | 1 | 0.7879 | 0.4242 | 0.4242 | 0.7878 | 0.7879 | 0.4242 | 1.0000 | 0.8636 | 0.5000 |
|---|
| 83 | 102 | 15 | 2 | 11 | 6 | 7 | 10 | 0 | 0.0606 | 0.4242 | 0.4242 | 0.0435 | 0.0606 | 0.4242 | 0.0138 | 0.1364 | 0.5000 |
|---|
| 84 | 102 | 2 | 16 | 5 | 13 | 11 | 7 | 0 | 0.0606 | 0.4242 | 0.4242 | 0.0391 | 0.0606 | 0.4242 | 8.9042e-05 | 0.1364 | 0.5000 |
|---|
| 85 | 102 | 2 | 16 | 4 | 14 | 12 | 6 | 0 | 0.0606 | 0.4242 | 0.4242 | 0.0391 | 0.0606 | 0.4242 | 1.9630e-04 | 0.1364 | 0.5000 |
|---|
| 86 | 102 | 1 | 16 | 6 | 11 | 9 | 8 | 0 | 0.0606 | 0.4242 | 0.4242 | 0.0391 | 0.0606 | 0.4242 | 2.7199e-05 | 0.1364 | 0.5000 |
|---|
| 87 | 102 | 3 | 16 | 5 | 14 | 12 | 7 | 1 | 0.0606 | 0.4242 | 0.4242 | 0.0391 | 0.0606 | 0.4242 | 1.3221e-04 | 0.1364 | 0.5000 |
|---|
| 88 | 102 | 16 | 2 | 13 | 5 | 6 | 12 | 0 | 0.0606 | 0.4242 | 0.4242 | 0.0594 | 0.0606 | 0.4242 | 0.0439 | 0.1364 | 0.5000 |
|---|
| 89 | 102 | 15 | 3 | 14 | 4 | 5 | 13 | 0 | 0.0606 | 0.4242 | 0.4242 | 0.0934 | 0.0606 | 0.4242 | 0.0919 | 0.1364 | 0.5000 |
|---|
| 90 | 102 | 3 | 16 | 11 | 8 | 5 | 14 | 1 | 0.7879 | 0.4242 | 0.4242 | 0.7878 | 0.7879 | 0.4242 | 1.0000 | 0.8636 | 0.5000 |
|---|
| 91 | 102 | 3 | 15 | 4 | 14 | 12 | 6 | 0 | 0.0606 | 0.4242 | 0.4242 | 0.0391 | 0.0606 | 0.4242 | 1.9630e-04 | 0.1364 | 0.5000 |
|---|
| 92 | 102 | 3 | 16 | 6 | 13 | 11 | 8 | 0 | 0.0606 | 0.4242 | 0.4242 | 0.0391 | 0.0606 | 0.4242 | 5.9968e-05 | 0.1364 | 0.5000 |
|---|
| 93 | 102 | 16 | 3 | 9 | 10 | 14 | 5 | 1 | 0.7879 | 0.4242 | 0.4242 | 0.7878 | 0.7879 | 0.4242 | 1.0000 | 0.8636 | 0.5000 |
|---|
| 94 | 102 | 15 | 2 | 12 | 5 | 6 | 11 | 0 | 0.0606 | 0.4242 | 0.4242 | 0.0514 | 0.0606 | 0.4242 | 0.0300 | 0.1364 | 0.5000 |
|---|
| 95 | 102 | 1 | 15 | 9 | 7 | 4 | 12 | 1 | 0.7879 | 0.4242 | 0.4242 | 0.7878 | 0.7879 | 0.4242 | 1.0000 | 0.8636 | 0.5000 |
|---|
| 96 | 102 | 15 | 1 | 7 | 9 | 13 | 3 | 1 | 0.7879 | 0.4242 | 0.4242 | 0.7878 | 0.7879 | 0.4242 | 1.0000 | 0.8636 | 0.5000 |
|---|
| 97 | 102 | 2 | 16 | 13 | 5 | 3 | 15 | 1 | 0.7879 | 0.4242 | 0.4242 | 0.7877 | 0.7879 | 0.4242 | 0.9996 | 0.8636 | 0.5000 |
|---|
| 98 | 102 | 2 | 15 | 3 | 14 | 12 | 5 | 0 | 0.0606 | 0.4242 | 0.4242 | 0.0391 | 0.0606 | 0.4242 | 2.9146e-04 | 0.1364 | 0.5000 |
|---|
| 99 | 102 | 16 | 5 | 15 | 6 | 7 | 14 | 0 | 0.0606 | 0.4242 | 0.4242 | 0.0725 | 0.0606 | 0.4242 | 0.0638 | 0.1364 | 0.5000 |
|---|
| 100 | 102 | 15 | 3 | 12 | 6 | 7 | 11 | 0 | 0.0606 | 0.4242 | 0.4242 | 0.0465 | 0.0606 | 0.4242 | 0.0204 | 0.1364 | 0.5000 |
|---|
| ⋮ |
|---|
Now we can compute BIC for these models
altSubjectData.BIC_M1 = altSubjectData.Deviance_M1 + log(n) * 4;
altSubjectData.BIC_M2 = altSubjectData.Deviance_M2 + log(n) * 4;
altSubjectData.BIC_M3 = altSubjectData.Deviance_M3 + log(n) * 4;
altSubjectData.BIC_M4 = altSubjectData.Deviance_M4 + log(n) * 5;
altSubjectData.BIC_M5 = altSubjectData.Deviance_M5 + log(n) * 5;
altSubjectData.BIC_M6 = altSubjectData.Deviance_M6 + log(n) * 5;
altSubjectData.BIC_M7 = altSubjectData.Deviance_M7 + log(n) * 4;
altSubjectData.BIC_M8 = altSubjectData.Deviance_M8 + log(n) * 5;
altSubjectData.BIC_M9 = altSubjectData.Deviance_M9 + log(n) * 1;
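Each of these lines instantiates the usual BIC formula, deviance + k·log(n): deviance is already -2·log-likelihood, k is each model's free-parameter count (which is why M1-M3 and M7 use 4, M4-M6 and M8 use 5, and M9 uses 1), and n is presumably the per-participant trial count defined earlier. A reusable helper would keep the parameter counts from drifting out of sync; a minimal sketch (hypothetical name, not part of the tutorial's code):
function bic = computeBIC(deviance, numParams, numTrials)
% Schwarz criterion; deviance is assumed to already be -2*log-likelihood
bic = deviance + numParams .* log(numTrials);
end
so that, e.g., altSubjectData.BIC_M4 = computeBIC(altSubjectData.Deviance_M4, 5, n);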
And now we can compare the BIC of all models
modelBIC = [sum(subjectData.BIC), sum(altSubjectData.BIC_M1), sum(altSubjectData.BIC_M2), sum(altSubjectData.BIC_M3), sum(altSubjectData.BIC_M4), ...
sum(altSubjectData.BIC_M5), sum(altSubjectData.BIC_M6), sum(altSubjectData.BIC_M7), sum(altSubjectData.BIC_M8), sum(altSubjectData.BIC_M9)];
[minBIC, idx] = min(modelBIC);
So the winner is Model 4, which includes every parameter except rho (note that the full model occupies the first slot of modelBIC, so idx = 5 corresponds to M4).
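To make the winner explicit in the output, the index can be mapped to a label; a small sketch (the modelNames vector is illustrative, ordered to match modelBIC above):
modelNames = ["Full", "M1", "M2", "M3", "M4", "M5", "M6", "M7", "M8", "M9"];
fprintf('Best model by summed BIC: %s (total BIC = %.2f)\n', modelNames(idx), minBIC);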
2.4 Validate the Best Model
First, let’s assess model performance at a basic level, starting with raw prediction accuracy:
sum(altTrialData.Chose1 == round(altTrialData.ad_Prob1))/height(altTrialData)
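Pooled accuracy can mask badly fit individuals, so it is worth breaking it down by participant; a quick sketch using the standard findgroups/splitapply pair:
gid = findgroups(altTrialData.SubjectID);  % group index per trial
correct = altTrialData.Chose1 == round(altTrialData.ad_Prob1);
subjectAcc = splitapply(@mean, correct, gid)  % one accuracy value per participant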
Next, we can look at the distribution of model fit (BIC) across participants:
[f, xi] = ksdensity(altSubjectData.BIC_M4);
plot(xi, f);
title('Density of BIC M4');
Let’s look at the worst-explained 25% of participants (those in the top BIC quartile):
Q3 = prctile(altSubjectData.BIC_M4, 75);
worstExplained = find(altSubjectData.BIC_M4 > Q3);
filteredData = altTrialData(ismember(altTrialData.SubjectID, altSubjectData.SubjectID(worstExplained)), :);
x_diff = filteredData.a0 - filteredData.a1;
y_diff = filteredData.Chose1 - filteredData.ad_Prob1;
uniqueSubjects = unique(filteredData.SubjectID);
colors = lines(length(uniqueSubjects));
figure; hold on;
for i = 1:length(uniqueSubjects)
    idx = filteredData.SubjectID == uniqueSubjects(i);
    scatter(x_diff(idx), y_diff(idx), [], colors(i, :));  % one color per participant
end
xlabel('a0 - a1');
ylabel('Chose1 - ad\_Prob1');
Nothing systematic here
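The same subject-colored scatter is rebuilt three more times below, so a small helper function would remove the duplication; a sketch with a hypothetical name (not part of the tutorial's code):
function plotResidualScatter(x, y, subjectIDs)
% Scatter prediction errors against a predictor, one color per participant
uniqueSubjects = unique(subjectIDs);
colors = lines(length(uniqueSubjects));
hold on;
for i = 1:length(uniqueSubjects)
    idx = subjectIDs == uniqueSubjects(i);
    scatter(x(idx), y(idx), [], colors(i, :));
end
ylabel('Chose1 - ad\_Prob1');
end
Each of the blocks below could then be a single call, e.g. plotResidualScatter(filteredData.a0 - filteredData.a2, y_diff, filteredData.SubjectID).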
x_diff = filteredData.a0 - filteredData.a2;
uniqueSubjects = unique(filteredData.SubjectID);
colors = lines(length(uniqueSubjects));
figure; hold on;
for i = 1:length(uniqueSubjects)
    idx = filteredData.SubjectID == uniqueSubjects(i);
    scatter(x_diff(idx), y_diff(idx), [], colors(i, :));
end
xlabel('a0 - a2');
ylabel('Chose1 - ad\_Prob1');
The model systematically over-predicts choosing option 1 when Player A has more and under-predicts it when Player A has less. This suggests a bias that only shows up against Choice 2's allocations, since we saw nothing comparable for Choice 1.
x_diff = filteredData.b0 - filteredData.b1;
uniqueSubjects = unique(filteredData.SubjectID);
colors = lines(length(uniqueSubjects));
figure; hold on;
for i = 1:length(uniqueSubjects)
    idx = filteredData.SubjectID == uniqueSubjects(i);
    scatter(x_diff(idx), y_diff(idx), [], colors(i, :));
end
xlabel('b0 - b1');
ylabel('Chose1 - ad\_Prob1');
This is fine
x_diff = filteredData.b0 - filteredData.b2;
uniqueSubjects = unique(filteredData.SubjectID);
colors = lines(length(uniqueSubjects));
figure; hold on;
for i = 1:length(uniqueSubjects)
    idx = filteredData.SubjectID == uniqueSubjects(i);
    scatter(x_diff(idx), y_diff(idx), [], colors(i, :));
end
xlabel('b0 - b2');
ylabel('Chose1 - ad\_Prob1');
Same issue as before: these participants seem to have a preference for (or bias toward) Player A when Player B has more, and again it only shows up against Choice 2's allocations. This isn't very actionable, and we're dealing with a trial set that is probably too small for the number of participants we have, so we simply keep it in mind as we proceed to check assumptions. First, linearity (pooled across both choices):
groups = altTrialData.a0 < altTrialData.b0;
uniqueGroups = unique(groups);
titles = ["a0 > b0", "a0 < b0"];  % ordered to match uniqueGroups = [0; 1]
colors = lines(length(uniqueGroups));
figure;
for i = 1:length(uniqueGroups)
    subplot(1, 2, i);
    idx = groups == uniqueGroups(i);
    plot(altTrialData.ad_Prob1(idx), altTrialData.Chose1(idx), 'o', 'Color', colors(i, :));
    title(titles(i));
end
This isn't great, but it's fine. Second, normality of error:
residuals = altTrialData.ad_Prob1 - altTrialData.Chose1;
residual_std = std(residuals);
normvals = normrnd(0, residual_std, 1000, 1);
[f, xi] = ksdensity(residuals, 'Bandwidth', residual_std);
[f_norm, xi_norm] = ksdensity(normvals, 'Bandwidth', residual_std);
figure; hold on;
plot(xi, f, 'LineWidth', 2, 'Color', 'b');
plot(xi_norm, f_norm, 'LineWidth', 2, 'Color', 'r');
legend({'Residuals', 'Normal reference'});
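The visual check can be backed by a formal test; one option is kstest on the standardized residuals (a sketch; residuals from a binary outcome are bounded, so treat the result as a rough guide):
z = (residuals - mean(residuals)) / std(residuals);
[h, p] = kstest(z)  % h = 0 means normality is not rejected at the 5% level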
Looks very good. Third, we can examine homoscedasticity:
x_vals = altTrialData.a1 - altTrialData.a2;
y_vals = altTrialData.ad_Prob1 - altTrialData.Chose1;
mdl = fitlm(x_vals, y_vals);
x_sorted = sort(x_vals);
[y_pred, y_ci] = predict(mdl, x_sorted);
figure; hold on;
scatter(x_vals, y_vals, 10, 'filled', 'MarkerFaceAlpha', 0.2);  % the residual cloud
plot(x_sorted, y_pred, 'k', 'LineWidth', 1.5);
fill([x_sorted; flipud(x_sorted)], [y_ci(:,1); flipud(y_ci(:,2))], 'k', ...
    'FaceAlpha', 0.2, 'EdgeColor', 'none');
xlabel('a1 - a2');
ylabel('ad\_Prob1 - Chose1');
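What the plot shows can be quantified with a Breusch-Pagan-style check: regress the squared residuals on the predictor and inspect the slope. A minimal sketch:
mdl_bp = fitlm(x_vals, y_vals.^2);  % squared residuals against a1 - a2
mdl_bp.Coefficients('x1', 'PValue')  % a small p-value would suggest heteroscedasticity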
A nice constant-variance cloud across all x values. And finally, independence of errors:
groups = altTrialData.a0 < altTrialData.b0;
uniqueGroups = unique(groups);
titles = ["a0 > b0", "a0 < b0"];  % ordered to match uniqueGroups = [0; 1]
colors = lines(length(uniqueGroups));
figure;
for i = 1:length(uniqueGroups)
    subplot(1, 2, i);
    idx = groups == uniqueGroups(i);
    x = altTrialData.a1(idx) - altTrialData.a2(idx);
    resid = altTrialData.Chose1(idx) - altTrialData.ad_Prob1(idx);  % renamed to avoid shadowing diff()
    plot(x, resid, 'o', 'Color', colors(i, :));
    title(titles(i));
    xlabel('a1 - a2');
    ylabel('Chose1 - ad\_Prob1');
end
The model shows a slight bias, under-predicting the likelihood of choosing option 1 when Player A has less than Player B and Choice 2 is the better option for Player A, but this does not look like a major issue.
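Independence over trials can also be probed numerically with a runs test on the residuals in presentation order (a sketch; runstest requires the Statistics and Machine Learning Toolbox):
resid = altTrialData.Chose1 - altTrialData.ad_Prob1;
[h, p] = runstest(resid)  % h = 0 means no evidence against randomness around the median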
Finally, let’s assess independence across participants, i.e. the extent to which our model captures all differences in choice behavior between different people:
altTrialData.a0greaterThanb0 = altTrialData.a0 > altTrialData.b0
altTrialData = 3704×18 table
| | SubjectID | a0 | b0 | a1 | b1 | a2 | b2 | Chose1 | alphaOnly_Prob1 | deltaOnly_Prob1 | rhoOnly_Prob1 | ad_Prob1 | ar_Prob1 | dr_Prob1 | noEpsilon_Prob1 | noGamma_Prob1 | gammaOnly_Prob1 | a0greaterThanb0 |
|---|
| 1 | 101 | 16 | 5 | 14 | 7 | 8 | 13 | 1 | 0.4545 | 0.5152 | 0.5152 | 0.5152 | 0.4545 | 0.5152 | 0.5606 | 0.5606 | 0.5000 | 1 |
|---|
| 2 | 101 | 16 | 1 | 5 | 12 | 15 | 2 | 0 | 0.4545 | 0.3939 | 0.3939 | 0.3939 | 0.4545 | 0.3939 | 0.4394 | 0.4394 | 0.5000 | 1 |
|---|
| 3 | 101 | 16 | 3 | 15 | 4 | 5 | 14 | 0 | 0.4545 | 0.5152 | 0.5152 | 0.5152 | 0.4545 | 0.5152 | 0.5606 | 0.5606 | 0.5000 | 1 |
|---|
| 4 | 101 | 5 | 16 | 11 | 10 | 7 | 14 | 0 | 0.4545 | 0.3939 | 0.3939 | 0.3939 | 0.4545 | 0.3939 | 0.4394 | 0.4394 | 0.5000 | 0 |
|---|
| 5 | 101 | 15 | 2 | 8 | 9 | 13 | 4 | 0 | 0.4545 | 0.3939 | 0.3939 | 0.3939 | 0.4545 | 0.3939 | 0.4394 | 0.4394 | 0.5000 | 1 |
|---|
| 6 | 101 | 5 | 16 | 7 | 14 | 12 | 9 | 1 | 0.4545 | 0.5152 | 0.5152 | 0.5152 | 0.4545 | 0.5152 | 0.5606 | 0.5606 | 0.5000 | 0 |
|---|
| 7 | 101 | 1 | 15 | 12 | 4 | 2 | 14 | 0 | 0.4545 | 0.3939 | 0.3939 | 0.3939 | 0.4545 | 0.3939 | 0.4394 | 0.4394 | 0.5000 | 0 |
|---|
| 8 | 101 | 2 | 15 | 10 | 7 | 4 | 13 | 0 | 0.4545 | 0.3939 | 0.3939 | 0.3939 | 0.4545 | 0.3939 | 0.4394 | 0.4394 | 0.5000 | 0 |
|---|
| 9 | 101 | 15 | 3 | 8 | 10 | 14 | 4 | 0 | 0.4545 | 0.3939 | 0.3939 | 0.3939 | 0.4545 | 0.3939 | 0.4394 | 0.4394 | 0.5000 | 1 |
|---|
| 10 | 101 | 3 | 15 | 10 | 8 | 5 | 13 | 0 | 0.4545 | 0.3939 | 0.3939 | 0.3939 | 0.4545 | 0.3939 | 0.4394 | 0.4394 | 0.5000 | 0 |
|---|
| 11 | 101 | 15 | 1 | 6 | 10 | 14 | 2 | 0 | 0.4545 | 0.3939 | 0.3939 | 0.3939 | 0.4545 | 0.3939 | 0.4394 | 0.4394 | 0.5000 | 1 |
|---|
| 12 | 101 | 3 | 15 | 11 | 7 | 4 | 14 | 1 | 0.4545 | 0.3939 | 0.3939 | 0.3939 | 0.4545 | 0.3939 | 0.4394 | 0.4394 | 0.5000 | 0 |
|---|
| 13 | 101 | 15 | 3 | 11 | 7 | 8 | 10 | 0 | 0.4545 | 0.5152 | 0.5152 | 0.5152 | 0.4545 | 0.5152 | 0.5606 | 0.5606 | 0.5000 | 1 |
|---|
| 14 | 101 | 14 | 2 | 13 | 3 | 4 | 12 | 0 | 0.4545 | 0.5152 | 0.5152 | 0.5152 | 0.4545 | 0.5152 | 0.5606 | 0.5606 | 0.5000 | 1 |
|---|
| 15 | 101 | 15 | 2 | 7 | 10 | 14 | 3 | 0 | 0.4545 | 0.3939 | 0.3939 | 0.3939 | 0.4545 | 0.3939 | 0.4394 | 0.4394 | 0.5000 | 1 |
|---|
| 16 | 101 | 3 | 16 | 12 | 7 | 4 | 15 | 0 | 0.4545 | 0.3939 | 0.3939 | 0.3939 | 0.4545 | 0.3939 | 0.4394 | 0.4394 | 0.5000 | 0 |
|---|
| 17 | 101 | 15 | 2 | 11 | 6 | 7 | 10 | 0 | 0.4545 | 0.5152 | 0.5152 | 0.5152 | 0.4545 | 0.5152 | 0.5606 | 0.5606 | 0.5000 | 1 |
|---|
| 18 | 101 | 2 | 16 | 5 | 13 | 11 | 7 | 0 | 0.4545 | 0.5152 | 0.5152 | 0.5152 | 0.4545 | 0.5152 | 0.5606 | 0.5606 | 0.5000 | 0 |
|---|
| 19 | 101 | 2 | 16 | 4 | 14 | 12 | 6 | 1 | 0.4545 | 0.5152 | 0.5152 | 0.5152 | 0.4545 | 0.5152 | 0.5606 | 0.5606 | 0.5000 | 0 |
|---|
| 20 | 101 | 1 | 16 | 6 | 11 | 9 | 8 | 1 | 0.4545 | 0.5152 | 0.5152 | 0.5152 | 0.4545 | 0.5152 | 0.5606 | 0.5606 | 0.5000 | 0 |
|---|
| 21 | 101 | 3 | 16 | 5 | 14 | 12 | 7 | 1 | 0.4545 | 0.5152 | 0.5152 | 0.5152 | 0.4545 | 0.5152 | 0.5606 | 0.5606 | 0.5000 | 0 |
|---|
| 22 | 101 | 16 | 2 | 13 | 5 | 6 | 12 | 0 | 0.4545 | 0.5152 | 0.5152 | 0.5152 | 0.4545 | 0.5152 | 0.5606 | 0.5606 | 0.5000 | 1 |
|---|
| 23 | 101 | 15 | 3 | 14 | 4 | 5 | 13 | 1 | 0.4545 | 0.5152 | 0.5152 | 0.5152 | 0.4545 | 0.5152 | 0.5606 | 0.5606 | 0.5000 | 1 |
|---|
| 24 | 101 | 3 | 16 | 11 | 8 | 5 | 14 | 0 | 0.4545 | 0.3939 | 0.3939 | 0.3939 | 0.4545 | 0.3939 | 0.4394 | 0.4394 | 0.5000 | 0 |
|---|
| 25 | 101 | 3 | 15 | 4 | 14 | 12 | 6 | 1 | 0.4545 | 0.5152 | 0.5152 | 0.5152 | 0.4545 | 0.5152 | 0.5606 | 0.5606 | 0.5000 | 0 |
|---|
| 26 | 101 | 3 | 16 | 6 | 13 | 11 | 8 | 1 | 0.4545 | 0.5152 | 0.5152 | 0.5152 | 0.4545 | 0.5152 | 0.5606 | 0.5606 | 0.5000 | 0 |
|---|
| 27 | 101 | 16 | 3 | 9 | 10 | 14 | 5 | 0 | 0.4545 | 0.3939 | 0.3939 | 0.3939 | 0.4545 | 0.3939 | 0.4394 | 0.4394 | 0.5000 | 1 |
|---|
| 28 | 101 | 15 | 2 | 12 | 5 | 6 | 11 | 1 | 0.4545 | 0.5152 | 0.5152 | 0.5152 | 0.4545 | 0.5152 | 0.5606 | 0.5606 | 0.5000 | 1 |
|---|
| 29 | 101 | 1 | 15 | 9 | 7 | 4 | 12 | 0 | 0.4545 | 0.3939 | 0.3939 | 0.3939 | 0.4545 | 0.3939 | 0.4394 | 0.4394 | 0.5000 | 0 |
|---|
| 30 | 101 | 15 | 1 | 7 | 9 | 13 | 3 | 1 | 0.4545 | 0.3939 | 0.3939 | 0.3939 | 0.4545 | 0.3939 | 0.4394 | 0.4394 | 0.5000 | 1 |
|---|
| 31 | 101 | 2 | 16 | 13 | 5 | 3 | 15 | 1 | 0.4545 | 0.3939 | 0.3939 | 0.3939 | 0.4545 | 0.3939 | 0.4394 | 0.4394 | 0.5000 | 0 |
|---|
| 32 | 101 | 2 | 15 | 3 | 14 | 12 | 5 | 0 | 0.4545 | 0.5152 | 0.5152 | 0.5152 | 0.4545 | 0.5152 | 0.5606 | 0.5606 | 0.5000 | 0 |
|---|
| 33 | 101 | 16 | 5 | 15 | 6 | 7 | 14 | 0 | 0.4545 | 0.5152 | 0.5152 | 0.5152 | 0.4545 | 0.5152 | 0.5606 | 0.5606 | 0.5000 | 1 |
|---|
| 34 | 101 | 15 | 3 | 12 | 6 | 7 | 11 | 0 | 0.4545 | 0.5152 | 0.5152 | 0.5152 | 0.4545 | 0.5152 | 0.5606 | 0.5606 | 0.5000 | 1 |
|---|
| 35 | 101 | 16 | 2 | 7 | 11 | 15 | 3 | 1 | 0.4545 | 0.3939 | 0.3939 | 0.3939 | 0.4545 | 0.3939 | 0.4394 | 0.4394 | 0.5000 | 1 |
|---|
| 36 | 101 | 5 | 16 | 6 | 15 | 13 | 8 | 0 | 0.4545 | 0.5152 | 0.5152 | 0.5152 | 0.4545 | 0.5152 | 0.5606 | 0.5606 | 0.5000 | 0 |
|---|
| 37 | 101 | 15 | 1 | 14 | 2 | 3 | 13 | 0 | 0.4545 | 0.5152 | 0.5152 | 0.5152 | 0.4545 | 0.5152 | 0.5606 | 0.5606 | 0.5000 | 1 |
|---|
| 38 | 101 | 13 | 2 | 7 | 8 | 12 | 3 | 0 | 0.4545 | 0.3939 | 0.3939 | 0.3939 | 0.4545 | 0.3939 | 0.4394 | 0.4394 | 0.5000 | 1 |
|---|
| 39 | 101 | 4 | 16 | 7 | 13 | 11 | 9 | 1 | 0.4545 | 0.5152 | 0.5152 | 0.5152 | 0.4545 | 0.5152 | 0.5606 | 0.5606 | 0.5000 | 0 |
|---|
| 40 | 101 | 2 | 16 | 11 | 7 | 4 | 14 | 0 | 0.4545 | 0.3939 | 0.3939 | 0.3939 | 0.4545 | 0.3939 | 0.4394 | 0.4394 | 0.5000 | 0 |
|---|
| 41 | 101 | 16 | 2 | 11 | 7 | 8 | 10 | 1 | 0.4545 | 0.5152 | 0.5152 | 0.5152 | 0.4545 | 0.5152 | 0.5606 | 0.5606 | 0.5000 | 1 |
|---|
| 42 | 101 | 1 | 15 | 11 | 5 | 2 | 14 | 0 | 0.4545 | 0.3939 | 0.3939 | 0.3939 | 0.4545 | 0.3939 | 0.4394 | 0.4394 | 0.5000 | 0 |
|---|
| 43 | 101 | 16 | 3 | 8 | 11 | 15 | 4 | 1 | 0.4545 | 0.3939 | 0.3939 | 0.3939 | 0.4545 | 0.3939 | 0.4394 | 0.4394 | 0.5000 | 1 |
|---|
| 44 | 101 | 3 | 16 | 10 | 9 | 6 | 13 | 0 | 0.4545 | 0.3939 | 0.3939 | 0.3939 | 0.4545 | 0.3939 | 0.4394 | 0.4394 | 0.5000 | 0 |
|---|
| 45 | 101 | 2 | 14 | 9 | 7 | 4 | 12 | 0 | 0.4545 | 0.3939 | 0.3939 | 0.3939 | 0.4545 | 0.3939 | 0.4394 | 0.4394 | 0.5000 | 0 |
|---|
| 46 | 101 | 2 | 14 | 10 | 6 | 3 | 13 | 1 | 0.4545 | 0.3939 | 0.3939 | 0.3939 | 0.4545 | 0.3939 | 0.4394 | 0.4394 | 0.5000 | 0 |
|---|
| 47 | 101 | 16 | 2 | 15 | 3 | 4 | 14 | 1 | 0.4545 | 0.5152 | 0.5152 | 0.5152 | 0.4545 | 0.5152 | 0.5606 | 0.5606 | 0.5000 | 1 |
|---|
| 48 | 101 | 3 | 16 | 4 | 15 | 13 | 6 | 1 | 0.4545 | 0.5152 | 0.5152 | 0.5152 | 0.4545 | 0.5152 | 0.5606 | 0.5606 | 0.5000 | 0 |
|---|
| 49 | 101 | 5 | 16 | 12 | 9 | 6 | 15 | 1 | 0.4545 | 0.3939 | 0.3939 | 0.3939 | 0.4545 | 0.3939 | 0.4394 | 0.4394 | 0.5000 | 0 |
|---|
| 50 | 101 | 3 | 15 | 5 | 13 | 11 | 7 | 0 | 0.4545 | 0.5152 | 0.5152 | 0.5152 | 0.4545 | 0.5152 | 0.5606 | 0.5606 | 0.5000 | 0 |
|---|
| 51 | 101 | 16 | 5 | 10 | 11 | 15 | 6 | 1 | 0.4545 | 0.3939 | 0.3939 | 0.3939 | 0.4545 | 0.3939 | 0.4394 | 0.4394 | 0.5000 | 1 |
|---|
| 52 | 101 | 16 | 1 | 7 | 10 | 14 | 3 | 1 | 0.4545 | 0.3939 | 0.3939 | 0.3939 | 0.4545 | 0.3939 | 0.4394 | 0.4394 | 0.5000 | 1 |
|---|
| 53 | 101 | 2 | 15 | 4 | 13 | 11 | 6 | 0 | 0.4545 | 0.5152 | 0.5152 | 0.5152 | 0.4545 | 0.5152 | 0.5606 | 0.5606 | 0.5000 | 0 |
|---|
| 54 | 101 | 1 | 16 | 13 | 4 | 2 | 15 | 0 | 0.4545 | 0.3939 | 0.3939 | 0.3939 | 0.4545 | 0.3939 | 0.4394 | 0.4394 | 0.5000 | 0 |
|---|
| 55 | 101 | 16 | 2 | 8 | 10 | 14 | 4 | 1 | 0.4545 | 0.3939 | 0.3939 | 0.3939 | 0.4545 | 0.3939 | 0.4394 | 0.4394 | 0.5000 | 1 |
|---|
| 56 | 101 | 2 | 16 | 12 | 6 | 3 | 15 | 1 | 0.4545 | 0.3939 | 0.3939 | 0.3939 | 0.4545 | 0.3939 | 0.4394 | 0.4394 | 0.5000 | 0 |
|---|
| 57 | 101 | 15 | 3 | 13 | 5 | 6 | 12 | 0 | 0.4545 | 0.5152 | 0.5152 | 0.5152 | 0.4545 | 0.5152 | 0.5606 | 0.5606 | 0.5000 | 1 |
|---|
| 58 | 101 | 2 | 16 | 6 | 12 | 10 | 8 | 1 | 0.4545 | 0.5152 | 0.5152 | 0.5152 | 0.4545 | 0.5152 | 0.5606 | 0.5606 | 0.5000 | 0 |
|---|
| 59 | 101 | 16 | 2 | 12 | 6 | 7 | 11 | 1 | 0.4545 | 0.5152 | 0.5152 | 0.5152 | 0.4545 | 0.5152 | 0.5606 | 0.5606 | 0.5000 | 1 |
|---|
| 60 | 101 | 16 | 5 | 13 | 8 | 9 | 12 | 1 | 0.4545 | 0.5152 | 0.5152 | 0.5152 | 0.4545 | 0.5152 | 0.5606 | 0.5606 | 0.5000 | 1 |
|---|
| 61 | 101 | 15 | 2 | 14 | 3 | 4 | 13 | 0 | 0.4545 | 0.5152 | 0.5152 | 0.5152 | 0.4545 | 0.5152 | 0.5606 | 0.5606 | 0.5000 | 1 |
|---|
| 62 | 101 | 16 | 1 | 6 | 11 | 15 | 2 | 1 | 0.4545 | 0.3939 | 0.3939 | 0.3939 | 0.4545 | 0.3939 | 0.4394 | 0.4394 | 0.5000 | 1 |
|---|
| 63 | 101 | 14 | 2 | 7 | 9 | 13 | 3 | 0 | 0.4545 | 0.3939 | 0.3939 | 0.3939 | 0.4545 | 0.3939 | 0.4394 | 0.4394 | 0.5000 | 1 |
|---|
| 64 | 101 | 1 | 15 | 3 | 13 | 11 | 5 | 1 | 0.4545 | 0.5152 | 0.5152 | 0.5152 | 0.4545 | 0.5152 | 0.5606 | 0.5606 | 0.5000 | 0 |
|---|
| 65 | 101 | 15 | 2 | 13 | 4 | 5 | 12 | 0 | 0.4545 | 0.5152 | 0.5152 | 0.5152 | 0.4545 | 0.5152 | 0.5606 | 0.5606 | 0.5000 | 1 |
|---|
| 66 | 101 | 2 | 15 | 11 | 6 | 3 | 14 | 1 | 0.4545 | 0.3939 | 0.3939 | 0.3939 | 0.4545 | 0.3939 | 0.4394 | 0.4394 | 0.5000 | 0 |
|---|
| 67 | 102 | 16 | 5 | 14 | 7 | 8 | 13 | 0 | 0.0606 | 0.4242 | 0.4242 | 0.0514 | 0.0606 | 0.4242 | 0.0300 | 0.1364 | 0.5000 | 1 |
|---|
| 68 | 102 | 16 | 1 | 5 | 12 | 15 | 2 | 0 | 0.7879 | 0.4242 | 0.4242 | 0.7878 | 0.7879 | 0.4242 | 1.0000 | 0.8636 | 0.5000 | 1 |
|---|
| 69 | 102 | 16 | 3 | 15 | 4 | 5 | 14 | 1 | 0.0606 | 0.4242 | 0.4242 | 0.1258 | 0.0606 | 0.4242 | 0.1306 | 0.1364 | 0.5000 | 1 |
|---|
| 70 | 102 | 5 | 16 | 11 | 10 | 7 | 14 | 0 | 0.7879 | 0.4242 | 0.4242 | 0.7878 | 0.7879 | 0.4242 | 1.0000 | 0.8636 | 0.5000 | 0 |
|---|
| 71 | 102 | 15 | 2 | 8 | 9 | 13 | 4 | 0 | 0.7879 | 0.4242 | 0.4242 | 0.7878 | 0.7879 | 0.4242 | 1.0000 | 0.8636 | 0.5000 | 1 |
|---|
| 72 | 102 | 5 | 16 | 7 | 14 | 12 | 9 | 0 | 0.0606 | 0.4242 | 0.4242 | 0.0391 | 0.0606 | 0.4242 | 5.9968e-05 | 0.1364 | 0.5000 | 0 |
|---|
| 73 | 102 | 1 | 15 | 12 | 4 | 2 | 14 | 1 | 0.7879 | 0.4242 | 0.4242 | 0.7877 | 0.7879 | 0.4242 | 0.9996 | 0.8636 | 0.5000 | 0 |
|---|
| 74 | 102 | 2 | 15 | 10 | 7 | 4 | 13 | 0 | 0.7879 | 0.4242 | 0.4242 | 0.7878 | 0.7879 | 0.4242 | 1.0000 | 0.8636 | 0.5000 | 0 |
|---|
| 75 | 102 | 15 | 3 | 8 | 10 | 14 | 4 | 1 | 0.7879 | 0.4242 | 0.4242 | 0.7878 | 0.7879 | 0.4242 | 1.0000 | 0.8636 | 0.5000 | 1 |
|---|
| 76 | 102 | 3 | 15 | 10 | 8 | 5 | 13 | 1 | 0.7879 | 0.4242 | 0.4242 | 0.7878 | 0.7879 | 0.4242 | 1.0000 | 0.8636 | 0.5000 | 0 |
|---|
| 77 | 102 | 15 | 1 | 6 | 10 | 14 | 2 | 1 | 0.7879 | 0.4242 | 0.4242 | 0.7878 | 0.7879 | 0.4242 | 1.0000 | 0.8636 | 0.5000 | 1 |
|---|
| 78 | 102 | 3 | 15 | 11 | 7 | 4 | 14 | 1 | 0.7879 | 0.4242 | 0.4242 | 0.7878 | 0.7879 | 0.4242 | 1.0000 | 0.8636 | 0.5000 | 0 |
|---|
| 79 | 102 | 15 | 3 | 11 | 7 | 8 | 10 | 0 | 0.0606 | 0.4242 | 0.4242 | 0.0417 | 0.0606 | 0.4242 | 0.0094 | 0.1364 | 0.5000 | 1 |
|---|
| 80 | 102 | 14 | 2 | 13 | 3 | 4 | 12 | 0 | 0.0606 | 0.4242 | 0.4242 | 0.0934 | 0.0606 | 0.4242 | 0.0919 | 0.1364 | 0.5000 | 1 |
|---|
| 81 | 102 | 15 | 2 | 7 | 10 | 14 | 3 | 1 | 0.7879 | 0.4242 | 0.4242 | 0.7878 | 0.7879 | 0.4242 | 1.0000 | 0.8636 | 0.5000 | 1 |
|---|
| 82 | 102 | 3 | 16 | 12 | 7 | 4 | 15 | 1 | 0.7879 | 0.4242 | 0.4242 | 0.7878 | 0.7879 | 0.4242 | 1.0000 | 0.8636 | 0.5000 | 0 |
|---|
| 83 | 102 | 15 | 2 | 11 | 6 | 7 | 10 | 0 | 0.0606 | 0.4242 | 0.4242 | 0.0435 | 0.0606 | 0.4242 | 0.0138 | 0.1364 | 0.5000 | 1 |
|---|
| 84 | 102 | 2 | 16 | 5 | 13 | 11 | 7 | 0 | 0.0606 | 0.4242 | 0.4242 | 0.0391 | 0.0606 | 0.4242 | 8.9042e-05 | 0.1364 | 0.5000 | 0 |
|---|
| 85 | 102 | 2 | 16 | 4 | 14 | 12 | 6 | 0 | 0.0606 | 0.4242 | 0.4242 | 0.0391 | 0.0606 | 0.4242 | 1.9630e-04 | 0.1364 | 0.5000 | 0 |
|---|
| 86 | 102 | 1 | 16 | 6 | 11 | 9 | 8 | 0 | 0.0606 | 0.4242 | 0.4242 | 0.0391 | 0.0606 | 0.4242 | 2.7199e-05 | 0.1364 | 0.5000 | 0 |
|---|
| 87 | 102 | 3 | 16 | 5 | 14 | 12 | 7 | 1 | 0.0606 | 0.4242 | 0.4242 | 0.0391 | 0.0606 | 0.4242 | 1.3221e-04 | 0.1364 | 0.5000 | 0 |
|---|
| 88 | 102 | 16 | 2 | 13 | 5 | 6 | 12 | 0 | 0.0606 | 0.4242 | 0.4242 | 0.0594 | 0.0606 | 0.4242 | 0.0439 | 0.1364 | 0.5000 | 1 |
|---|
| 89 | 102 | 15 | 3 | 14 | 4 | 5 | 13 | 0 | 0.0606 | 0.4242 | 0.4242 | 0.0934 | 0.0606 | 0.4242 | 0.0919 | 0.1364 | 0.5000 | 1 |
|---|
| 90 | 102 | 3 | 16 | 11 | 8 | 5 | 14 | 1 | 0.7879 | 0.4242 | 0.4242 | 0.7878 | 0.7879 | 0.4242 | 1.0000 | 0.8636 | 0.5000 | 0 |
|---|
| 91 | 102 | 3 | 15 | 4 | 14 | 12 | 6 | 0 | 0.0606 | 0.4242 | 0.4242 | 0.0391 | 0.0606 | 0.4242 | 1.9630e-04 | 0.1364 | 0.5000 | 0 |
|---|
| 92 | 102 | 3 | 16 | 6 | 13 | 11 | 8 | 0 | 0.0606 | 0.4242 | 0.4242 | 0.0391 | 0.0606 | 0.4242 | 5.9968e-05 | 0.1364 | 0.5000 | 0 |
|---|
| 93 | 102 | 16 | 3 | 9 | 10 | 14 | 5 | 1 | 0.7879 | 0.4242 | 0.4242 | 0.7878 | 0.7879 | 0.4242 | 1.0000 | 0.8636 | 0.5000 | 1 |
|---|
| 94 | 102 | 15 | 2 | 12 | 5 | 6 | 11 | 0 | 0.0606 | 0.4242 | 0.4242 | 0.0514 | 0.0606 | 0.4242 | 0.0300 | 0.1364 | 0.5000 | 1 |
|---|
| 95 | 102 | 1 | 15 | 9 | 7 | 4 | 12 | 1 | 0.7879 | 0.4242 | 0.4242 | 0.7878 | 0.7879 | 0.4242 | 1.0000 | 0.8636 | 0.5000 | 0 |
|---|
| 96 | 102 | 15 | 1 | 7 | 9 | 13 | 3 | 1 | 0.7879 | 0.4242 | 0.4242 | 0.7878 | 0.7879 | 0.4242 | 1.0000 | 0.8636 | 0.5000 | 1 |
|---|
| 97 | 102 | 2 | 16 | 13 | 5 | 3 | 15 | 1 | 0.7879 | 0.4242 | 0.4242 | 0.7877 | 0.7879 | 0.4242 | 0.9996 | 0.8636 | 0.5000 | 0 |
|---|
| 98 | 102 | 2 | 15 | 3 | 14 | 12 | 5 | 0 | 0.0606 | 0.4242 | 0.4242 | 0.0391 | 0.0606 | 0.4242 | 2.9146e-04 | 0.1364 | 0.5000 | 0 |
|---|
| 99 | 102 | 16 | 5 | 15 | 6 | 7 | 14 | 0 | 0.0606 | 0.4242 | 0.4242 | 0.0725 | 0.0606 | 0.4242 | 0.0638 | 0.1364 | 0.5000 | 1 |
|---|
| 100 | 102 | 15 | 3 | 12 | 6 | 7 | 11 | 0 | 0.0606 | 0.4242 | 0.4242 | 0.0465 | 0.0606 | 0.4242 | 0.0204 | 0.1364 | 0.5000 | 1 |
|---|
| ⋮ |
|---|
ric_model = fitglme(altTrialData, 'Chose1 ~ ad_Prob1 + a0greaterThanb0 + (1 | SubjectID)', 'Distribution', 'binomial');
ric_model.Rsquared
ans =
Ordinary: 0.6858
Adjusted: 0.6856
So there are no convergence warnings here (i.e., no warning that the fitting routine hit its maximum number of iterations without satisfying the tolerance, the criterion it uses to decide a solution is good enough and stop looking).
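If we wanted to catch such warnings programmatically rather than by eyeballing the console, a minimal sketch could use MATLAB's lastwarn around a re-run of the fit above (the printed message text is just illustrative):
lastwarn(''); % clear the stored warning state before fitting
ric_model = fitglme(altTrialData, 'Chose1 ~ ad_Prob1 + a0greaterThanb0 + (1 | SubjectID)', 'Distribution', 'binomial');
[warnMsg, warnID] = lastwarn; % both come back empty if the fit was clean
if ~isempty(warnMsg)
    fprintf('Fit raised a warning: %s (%s)\n', warnMsg, warnID);
end
Let's look at the R-squared of the standard model for comparison.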
model = fitglm(altTrialData, 'Chose1 ~ ad_Prob1', 'Distribution', 'binomial');
model.Rsquared
ans =
Ordinary: 0.4435
Adjusted: 0.4434
LLR: 0.3890
Deviance: 0.3890
AdjGeneralized: 0.5542
So there is a lot of unexplained variance in the standard model. Let's compare it to the model with only the random intercept now:
ri_model = fitglme(altTrialData, 'Chose1 ~ ad_Prob1 + (1 | SubjectID)', 'Distribution', 'binomial');
ri_model.Rsquared
ans =
Ordinary: 0.6635
Adjusted: 0.6635
Wow, so the condition really only explains a very small amount of variance; most of the gain comes from the random intercept, i.e., individual variance that the fixed effects alone don't pick up.
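To put a number on that, we can simply difference the two ordinary R-squared values reported above:
ric_model.Rsquared.Ordinary - ri_model.Rsquared.Ordinary % 0.6858 - 0.6635, so condition adds only ~2 percentage points
Let's compare this to the 3-norm model: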
threeNormmodel = fitglm(trialData, 'Chose1 ~ Prob1', 'Distribution', 'binomial');
threeNormmodel.Rsquared
ans =
Ordinary: 0.4399
Adjusted: 0.4398
LLR: 0.3868
Deviance: 0.3868
AdjGeneralized: 0.5518
So somehow there's a lot of individual variance that our models just don't capture at all. Let's see if we can trigger a convergence warning by additionally including a random slope for condition:
ris_model = fitglme(altTrialData, 'Chose1 ~ ad_Prob1 + a0greaterThanb0 + (1 + a0greaterThanb0 | SubjectID)', 'Distribution', 'binomial');
ris_model.Rsquared
ans =
Ordinary: 0.6891
Adjusted: 0.6890
No convergence warnings even here. Okay, let's take it on board that our model is somehow missing around 20% of the apparent variance in this behavior. We can proceed to fivefold cross-validation.
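Before the full loop, it's worth sanity-checking the fold-boundary arithmetic (a hypothetical check assuming 65 trials per subject, which the log(65) BIC penalty used below suggests):
z = 1:5;
round((z - 1) * (65 / 5) + 1) % fold starts: 1 14 27 40 53
round(z * (65 / 5)) % fold ends: 13 26 39 52 65
Each fold withholds 13 consecutive positions of the shuffled trial order. Now the loop itself: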
trialData.Prob1_ff = zeros(height(trialData), 1);
fivefold = table(); % results table, one row per subject (assumed preallocation; the original may do this earlier)
for i = 1:length(included_subjects)
    df = grab_data(included_subjects(i));
    df.Prob1 = zeros(height(df), 1);
    order = randperm(height(df)); % shuffle the trial indices before splitting into folds
    df.Pred = zeros(height(df), 1);
    for z = 1:5 % one iteration per fold
        j = round((z - 1) * (height(df) / 5) + 1); % first withheld position
        n = round(z * (height(df) / 5)); % last withheld position
        withheld = order(j:n); % trials held out of this fold's fit
        result_ff = optimize(@of_ad, initial_params([1, 2, 4:6]), lower_bounds([1, 2, 4:6]), upper_bounds([1, 2, 4:6]), df(~ismember(1:height(df), withheld), :));
        pars = [result_ff(1:2), 0, result_ff(3:5)]; % reinsert rho = 0, as in M4
        df.Prob1(withheld) = generatePredictions(pars, df(withheld, :));
        A_ff(z) = result_ff(1); D_ff(z) = result_ff(2); B_ff(z) = result_ff(3); % store this fold's parameters
        E_ff(z) = result_ff(4); G_ff(z) = result_ff(5);
    end
    Deviance_ff = -2 * sum(df.Chose1 .* log(df.Prob1) + (1 - df.Chose1) .* log(1 - df.Prob1));
    fivefold(i, 1:27) = {included_subjects(i), Deviance_ff, ...
        A_ff(1), A_ff(2), A_ff(3), A_ff(4), A_ff(5), ...
        D_ff(1), D_ff(2), D_ff(3), D_ff(4), D_ff(5), ...
        B_ff(1), B_ff(2), B_ff(3), B_ff(4), B_ff(5), ...
        E_ff(1), E_ff(2), E_ff(3), E_ff(4), E_ff(5), ...
        G_ff(1), G_ff(2), G_ff(3), G_ff(4), G_ff(5)};
    trialData.Prob1_ff(trialData.SubjectID == included_subjects(i)) = df.Prob1;
end
fivefold.Properties.VariableNames = {'SubjectID', 'Deviance', ...
'A_F1', 'A_F2', 'A_F3', 'A_F4', 'A_F5', ...
'D_F1', 'D_F2', 'D_F3', 'D_F4', 'D_F5', ...
'B_F1', 'B_F2', 'B_F3', 'B_F4', 'B_F5', ...
'E_F1', 'E_F2', 'E_F3', 'E_F4', 'E_F5', ...
'G_F1', 'G_F2', 'G_F3', 'G_F4', 'G_F5'}
fivefold = 57×27 table
| | SubjectID | Deviance | A_F1 | A_F2 | A_F3 | A_F4 | A_F5 | D_F1 | D_F2 | D_F3 | D_F4 | D_F5 | B_F1 | B_F2 | B_F3 | B_F4 | B_F5 | E_F1 | E_F2 | E_F3 | E_F4 | E_F5 | G_F1 | G_F2 | G_F3 | G_F4 | G_F5 |
|---|
| 1 | 101 | 97.1880 | 0.4782 | 0.4811 | 0.4145 | 1.2103 | 0.2482 | 1.4307 | 1.3962 | 1.4619 | 0.7569 | 1.4869 | 6.3071 | 6.0420 | 5.9012 | 4.0285 | 4.2533 | 0.4764 | 0.4500 | 0.4038 | 0.5000 | 0.3964 | -0.0877 | -0.0556 | -0.1190 | -0.0094 | 0.0045 |
|---|
| 2 | 102 | 67.7072 | 1.1095 | 0.8451 | 1.5994 | 0.8706 | 1.1523 | 0.1646 | 0.1202 | 0.1064 | 0.1188 | 0.2034 | 2.6240 | 4.0487 | 7.4646 | 3.9569 | 3.3083 | 0.1347 | 0.1563 | 0.1259 | 0.1010 | 0.1092 | -0.5000 | -0.3297 | -0.3631 | -0.2338 | -0.2933 |
|---|
| 3 | 103 | 21.8285 | 1.5147 | 1.4924 | 1.4881 | 1.4735 | 1.4882 | 0.1452 | 0.1356 | 0.1368 | 0.1330 | 0.1355 | 6.6214 | 6.6789 | 6.6305 | 6.7560 | 6.6955 | 0.0186 | 4.8216e-05 | 0.0209 | 0.0186 | 0.0200 | -0.4981 | -0.0162 | -0.4984 | -0.4981 | -0.4984 |
|---|
| 4 | 104 | 21.9339 | 1.5941 | 1.5947 | 1.4771 | 1.6330 | 1.5938 | 0.1011 | 0.1020 | 0.1337 | 0.0745 | 0.0982 | 7.4917 | 7.5030 | 6.6784 | 6.3293 | 7.5542 | 0.0179 | 0.0193 | 4.8248e-05 | 0.0200 | 0.0186 | 0.4980 | 0.4982 | -0.0089 | 0.4983 | 0.4981 |
|---|
| 5 | 105 | 22.8919 | 0.4612 | 4.0252e-06 | 3.3869e-06 | 0.4791 | 6.5183e-06 | 1.6334 | 1.0324 | 1.0426 | 1.4377 | 0.1188 | 5.1374 | 0.6397 | 0.6538 | 6.5223 | 5.3681 | 0.0400 | 0.0219 | 0.0298 | 0.0228 | 1.3567e-05 | -0.4991 | -0.5000 | -0.5000 | -0.4985 | -0.3360 |
|---|
| 6 | 106 | 66.9448 | 1.6152 | 1.5539 | 1.6140 | 1.6016 | 1.5976 | 0.1003 | 0.0908 | 0.1093 | 0.0968 | 0.0987 | 7.6332 | 7.5344 | 7.5096 | 7.4398 | 7.6046 | 0.1896 | 0.1154 | 0.1538 | 0.1923 | 0.1141 | -0.1836 | -3.5427e-08 | -0.3750 | -0.2000 | -0.1494 |
|---|
| 7 | 107 | 75.0862 | 0.2688 | 0.3756 | 0.3344 | 0.2535 | 0.3125 | 1.6047 | 1.6857 | 1.5068 | 1.6019 | 1.5749 | 3.9619 | 4.2900 | 4.1267 | 4.1073 | 4.6118 | 0.2271 | 0.2607 | 0.2054 | 0.1893 | 0.2464 | -0.0283 | -0.1164 | -0.1957 | -0.0283 | -0.1243 |
|---|
| 8 | 108 | 93.2399 | 0.3016 | 1.9868 | 0.8220 | 0.8259 | 1.1661e-04 | 0.9049 | 1.3604 | 0.6750 | 0.6462 | 1.0194 | 0.0993 | 9.9382 | 1.7947 | 1.7891 | 0.1750 | 0.1323 | 0.3878 | 0.3550 | 0.3917 | 0.4661 | -0.4999 | -0.1668 | -0.2839 | -0.1939 | -0.1502 |
|---|
| 9 | 109 | 77.7570 | 1.7492 | 0.3673 | 0.2749 | 0.2087 | 1.2049e-07 | 1.7006 | 1.3090 | 1.6691 | 1.5390 | 0.0594 | 9.3596 | 6.1492 | 3.8712 | 4.4840 | 3.7288 | 0.4422 | 0.3292 | 0.3462 | 0.3264 | 0.2733 | -0.2234 | -0.3178 | -0.2778 | -0.3298 | -0.2838 |
|---|
| 10 | 110 | 26.9598 | 8.7831e-06 | 3.4212e-06 | 4.6848e-06 | 5.3349e-06 | 0.4314 | 1.0470 | 1.0347 | 1.0426 | 1.0428 | 1.5489 | 0.7270 | 0.6722 | 0.6744 | 0.7269 | 4.0412 | 0.0635 | 0.0538 | 0.0744 | 0.0516 | 0.0417 | -0.5000 | -0.5000 | -0.5000 | -0.5000 | -0.4992 |
|---|
| 11 | 111 | 0.0069 | 1.5029 | 1.4905 | 1.4874 | 1.4928 | 1.4922 | 0.1277 | 0.1352 | 0.1389 | 0.1399 | 0.1356 | 6.1319 | 6.6638 | 6.6806 | 6.6670 | 6.6815 | 4.7490e-05 | 4.7845e-05 | 4.8289e-05 | 4.7938e-05 | 4.7864e-05 | 0.0954 | -0.0182 | -0.0139 | 0.0012 | -0.0180 |
|---|
| 12 | 112 | 89.1837 | 1.1914 | 0.4065 | 1.5740 | 1.6007 | 0.6154 | 1.3615 | 0.8795 | 1.5888 | 1.6342 | 0.7423 | 7.3283 | 0.4012 | 9.6037 | 9.1971 | 0.4522 | 0.2747 | 0.2322 | 0.2560 | 0.2767 | 0.0976 | -0.0141 | -0.0816 | -0.1663 | -0.1917 | -0.1630 |
|---|
| 13 | 113 | 0.0068 | 1.4906 | 1.4905 | 1.4903 | 1.4890 | 1.4887 | 0.1395 | 0.1351 | 0.1389 | 0.1362 | 0.1339 | 6.6658 | 6.6642 | 6.6519 | 6.6426 | 6.6381 | 4.7844e-05 | 4.7751e-05 | 4.8447e-05 | 4.7911e-05 | 4.7794e-05 | -0.0067 | -2.9492e-04 | -0.0252 | -0.0123 | -0.0077 |
|---|
| 14 | 114 | 68.9381 | 1.0594 | 1.0612 | 1.0717 | 1.0767 | 1.0419 | 0.1775 | 0.1086 | 0.1039 | 0.1438 | 3.1824e-06 | 3.9065 | 1.5400 | 1.7983 | 1.9556 | 1.0145 | 0.1569 | 0.2009 | 0.1666 | 0.1865 | 0.1779 | -0.3593 | -0.2982 | -0.2821 | -0.2937 | -0.5000 |
|---|
| 15 | 115 | 40.6152 | 1.4978 | 1.4820 | 1.4818 | 1.4725 | 1.4789 | 0.1362 | 0.1394 | 0.1337 | 0.1331 | 0.1381 | 6.5665 | 6.4842 | 6.6231 | 6.5591 | 6.5663 | 0.1334 | 0.1667 | 0.1111 | 0.1305 | 0.1400 | -0.4997 | -0.4998 | -0.4997 | -0.4998 | -0.4998 |
|---|
| 16 | 116 | 31.7668 | 1.4966 | 1.4957 | 1.4899 | 1.4856 | 1.5333 | 0.1365 | 0.1314 | 0.1393 | 0.1370 | 0.1410 | 6.6436 | 6.3717 | 6.5456 | 6.5645 | 6.5403 | 0.0834 | 0.0926 | 0.0800 | 0.1000 | 0.0962 | -0.4996 | -0.4996 | -0.4996 | -0.4996 | -0.4997 |
|---|
| 17 | 117 | 82.4332 | 0.4378 | 0.2401 | 0.2470 | 1.2504 | 1.2197 | 1.5881 | 1.5238 | 1.5619 | 1.9999 | 1.9859 | 4.4440 | 4.2653 | 4.2184 | 9.9997 | 9.9355 | 0.2711 | 0.2530 | 0.1923 | 0.2792 | 0.2284 | -0.0902 | -0.0765 | -0.2000 | -0.0115 | 0.0472 |
|---|
| 18 | 118 | 61.2437 | 1.6063 | 1.5629 | 1.6249 | 1.5749 | 1.5730 | 0.0998 | 0.0984 | 0.1039 | 0.0896 | 0.1018 | 7.6117 | 7.4665 | 7.2526 | 7.3494 | 7.3561 | 0.2084 | 0.1867 | 0.2024 | 0.2381 | 0.1538 | -0.3345 | -0.3929 | -0.2941 | -0.3250 | -0.3750 |
|---|
| 19 | 119 | 22.1591 | 1.5163 | 1.4921 | 1.4849 | 1.4857 | 1.4923 | 0.1345 | 0.1357 | 0.1518 | 0.1370 | 0.1353 | 6.4528 | 6.7009 | 6.6186 | 6.8713 | 6.6778 | 0.0200 | 4.7720e-05 | 0.0173 | 0.0193 | 0.0209 | -0.4984 | 0.0011 | -0.4979 | -0.4983 | -0.4985 |
|---|
| 20 | 120 | 93.5268 | 0.5127 | 1.4516 | 0.8670 | 0.5215 | 1.2887 | 1.4810 | 1.9993 | 1.1172 | 1.4694 | 0.8020 | 4.0034 | 9.9965 | 4.1014 | 3.9969 | 6.0705 | 0.5000 | 0.4799 | 0.5000 | 0.5000 | 0.5000 | -0.0192 | -0.0849 | -0.0769 | -0.0962 | -0.1538 |
|---|
| 21 | 121 | 88.6196 | 0.9855 | 0.6243 | 1.5058 | 0.9408 | 1.0627 | 0.1609 | 1.3683 | 0.0800 | 1.1176 | 0.1522 | 0.3974 | 3.9935 | 6.7758 | 4.1189 | 1.7240 | 0.2201 | 0.5000 | 0.3155 | 0.5000 | 0.3120 | -0.4577 | -0.1346 | -0.1604 | -0.2115 | -0.3582 |
|---|
| 22 | 122 | 0.0071 | 1.4753 | 1.4910 | 1.4867 | 1.4870 | 1.4900 | 0.1337 | 0.1358 | 0.1359 | 0.1489 | 0.1352 | 6.7488 | 6.6994 | 6.6392 | 6.6618 | 6.6679 | 4.7765e-05 | 4.7729e-05 | 4.8605e-05 | 4.7861e-05 | 4.7751e-05 | -0.0094 | -0.0139 | -0.0286 | 0.0036 | -0.0066 |
|---|
| 23 | 124 | 86.4589 | 2.0000 | 1.0453 | 2.0000 | 0.9810 | 1.5995 | 0.5761 | 0.1638 | 0.3635 | 0.1902 | 0.0945 | 9.9999 | 1.6175 | 9.9999 | 4.2137 | 7.3508 | 0.3056 | 0.2518 | 0.2982 | 0.3013 | 0.2629 | -0.0456 | 0.0600 | -0.0200 | 0.0517 | 0.0246 |
|---|
| 24 | 125 | 23.6235 | 1.2072 | 1.0957 | 1.1133 | 1.1013 | 1.5930 | 0.0842 | 0.0945 | 0.1056 | 0.1077 | 0.1037 | 3.5874 | 1.7717 | 2.5622 | 2.3558 | 7.5027 | 0.0394 | 4.8474e-06 | 0.0292 | 0.0328 | 0.0186 | 0.4991 | -0.3480 | 0.4988 | 0.4992 | 0.4981 |
|---|
| 25 | 126 | 73.8646 | 0.9276 | 1.5760 | 1.1112 | 1.5636 | 1.5743 | 1.1216 | 0.0999 | 0.8560 | 0.0934 | 0.0954 | 4.0569 | 7.4095 | 4.1242 | 7.2248 | 7.2828 | 0.5000 | 0.4167 | 0.5000 | 0.3877 | 0.3681 | -0.2255 | -0.3000 | -0.2885 | -0.2937 | -0.3491 |
|---|
| 26 | 127 | 91.7783 | 1.2210 | 1.6104 | 1.5936 | 1.0910 | 1.6137 | 0.6805 | 0.1025 | 0.0904 | 0.8544 | 0.1009 | 4.0500 | 7.7714 | 7.5032 | 3.9911 | 7.6468 | 0.5000 | 0.3575 | 0.3654 | 0.5000 | 0.4289 | -0.0472 | -0.0697 | -0.1842 | -0.1038 | -0.0628 |
|---|
| 27 | 128 | 65.5510 | 1.4908 | 1.5022 | 1.4900 | 1.4915 | 1.4758 | 0.1353 | 0.1379 | 0.1413 | 0.1353 | 0.1350 | 6.6623 | 6.7091 | 6.7314 | 6.6831 | 6.5470 | 0.0536 | 0.0556 | 0.0625 | 4.7776e-05 | 0.0536 | -0.4994 | -0.4994 | -0.4995 | -0.0197 | -0.4994 |
|---|
| 28 | 129 | 91.6851 | 1.2844 | 1.2599 | 1.2852 | 0.2535 | 1.2613 | 2.0000 | 2.0000 | 2.0000 | 1.4257 | 2.0000 | 9.9999 | 9.9999 | 9.9999 | 4.9598 | 10.0000 | 0.3459 | 0.3304 | 0.3362 | 0.2926 | 0.3889 | -0.0545 | -0.1048 | -0.0743 | -0.1835 | -0.0707 |
|---|
| 29 | 130 | 21.8617 | 1.5004 | 1.4901 | 1.4918 | 1.4983 | 1.5137 | 0.1397 | 0.1357 | 0.1362 | 0.1406 | 0.1377 | 6.7015 | 6.7038 | 6.6698 | 6.6643 | 6.6641 | 0.0200 | 4.8635e-05 | 0.0179 | 0.0209 | 0.0193 | -0.4983 | -0.0050 | -0.4980 | -0.4984 | -0.4982 |
|---|
| 30 | 131 | 42.6282 | 1.6121 | 1.5915 | 1.5630 | 1.5916 | 1.6081 | 0.1009 | 0.0993 | 0.0862 | 0.1015 | 0.1012 | 7.6051 | 7.5000 | 7.5072 | 7.3970 | 7.5703 | 0.0957 | 0.0914 | 0.0962 | 0.0914 | 0.0757 | -0.1269 | -0.2812 | -0.1000 | -0.2812 | -0.0283 |
|---|
| 31 | 132 | 59.7471 | 0.3182 | 1.5791e-05 | 0.3031 | 0.4748 | 0.2581 | 1.6884 | 1.0312 | 1.5223 | 1.5428 | 1.4909 | 4.0137 | 0.7017 | 4.8275 | 6.5969 | 4.4305 | 0.1450 | 0.0816 | 0.0800 | 0.1489 | 0.1339 | -0.3621 | -0.3464 | -0.4996 | -0.3708 | -0.3617 |
|---|
| 32 | 133 | 85.3685 | 0.3826 | 0.5485 | 0.5299 | 0.5090 | 1.4639 | 0.8860 | 0.7965 | 0.8603 | 0.8237 | 1.9995 | 0.3932 | 0.4938 | 0.9636 | 0.3247 | 9.9977 | 0.2602 | 0.2663 | 0.3355 | 0.1804 | 0.3591 | -0.2117 | -0.4033 | -0.2280 | -0.5000 | -0.2153 |
|---|
| 33 | 134 | 90.5535 | 1.5747 | 1.0660 | 1.0310 | 0.6950 | 1.0459 | 0.0895 | 0.1845 | 0.2023 | 1.2844 | 0.1373 | 7.3402 | 1.8855 | 1.5330 | 3.9997 | 1.3785 | 0.3252 | 0.3770 | 0.3064 | 0.5000 | 0.3901 | -0.0695 | -0.2106 | -0.1822 | -0.0769 | -0.1387 |
|---|
| 34 | 135 | 93.3505 | 1.4044 | 1.5199 | 1.5193 | 1.2732 | 1.8612 | 2.0000 | 1.9988 | 1.9977 | 2.0000 | 1.9996 | 10.0000 | 9.9941 | 9.9889 | 9.9998 | 9.9982 | 0.3204 | 0.3361 | 0.3652 | 0.3317 | 0.3571 | -0.0920 | -0.1878 | -0.1346 | -0.1985 | -0.2000 |
|---|
| 35 | 136 | 55.5598 | 1.6077 | 1.5668 | 1.4696 | 1.5855 | 1.5944 | 0.1012 | 0.0925 | 0.1354 | 0.1041 | 0.0998 | 7.4530 | 7.4181 | 6.5493 | 7.5246 | 7.4839 | 0.1339 | 0.1093 | 0.0962 | 0.1147 | 0.0762 | -0.3617 | -0.3170 | -0.4996 | -0.3385 | -0.2570 |
|---|
| 36 | 138 | 84.0317 | 0.2412 | 0.4576 | 0.2344 | 0.2616 | 0.2397 | 0.6922 | 1.4281 | 1.5719 | 1.4697 | 1.4939 | 4.4287 | 6.6881 | 4.0390 | 4.5317 | 4.1948 | 0.4403 | 0.3364 | 0.3119 | 0.3452 | 0.3523 | -0.2380 | -0.0946 | -0.3312 | -0.2241 | -0.1452 |
|---|
| 37 | 140 | 96.0168 | 0.4821 | 0.2943 | 1.0051 | 0.9087 | 1.7132 | 1.4749 | 1.5533 | 1.2505 | 1.1456 | 2.0000 | 6.2952 | 4.1770 | 6.2699 | 5.4545 | 10.0000 | 0.4256 | 0.4048 | 0.3422 | 0.3098 | 0.3201 | 0.0385 | 0.0147 | 0.0179 | -0.0233 | -0.0371 |
|---|
| 38 | 141 | 91.3465 | 1.5959 | 1.5555 | 1.5170 | 0.6224 | 2.0000 | 1.6082 | 1.5538 | 1.5164 | 1.1730 | 1.7510 | 9.0382 | 8.8107 | 9.0324 | 4.1924 | 10.0000 | 0.3668 | 0.4688 | 0.4005 | 0.5000 | 0.4027 | -0.1632 | -0.0511 | -0.1072 | -0.0686 | -0.0776 |
|---|
| 39 | 142 | 98.5530 | 1.5810 | 1.5659 | 1.5344 | 0.6930 | 1.9987 | 0.0933 | 0.0943 | 0.0893 | 1.2839 | 1.4004 | 7.5324 | 7.3812 | 7.2323 | 3.9867 | 9.9934 | 0.3967 | 0.4145 | 0.4137 | 0.5000 | 0.4116 | -0.0332 | -0.0361 | -0.1475 | -0.0283 | -0.2004 |
|---|
| 40 | 143 | 86.4086 | 0.4776 | 1.5969 | 1.5396 | 1.0148 | 1.6214 | 1.5136 | 0.0935 | 0.0887 | 0.9540 | 0.0906 | 3.9862 | 7.4591 | 7.5123 | 4.1221 | 7.4984 | 0.5000 | 0.2692 | 0.2467 | 0.5000 | 0.2197 | -0.1154 | -0.2857 | -0.1757 | -0.1346 | -0.1207 |
|---|
| 41 | 144 | 21.5113 | 1.4838 | 1.4858 | 1.4974 | 1.4928 | 1.4921 | 0.1372 | 0.1393 | 0.1344 | 0.1341 | 0.1359 | 6.6345 | 6.6116 | 6.6086 | 6.6158 | 6.6494 | 0.0193 | 0.0179 | 0.0209 | 0.0209 | 4.9496e-05 | -0.4982 | -0.4980 | -0.4984 | -0.4984 | -0.0319 |
|---|
| 42 | 145 | 89.7992 | 0.6279 | 1.5407 | 2.0000 | 1.9997 | 1.8272 | 1.2021 | 1.5086 | 0.6097 | 1.8511 | 1.7656 | 5.3102 | 8.5852 | 9.9998 | 9.9987 | 9.5203 | 0.5000 | 0.3918 | 0.3365 | 0.4072 | 0.4254 | -0.1400 | -0.1666 | -0.2654 | -0.2368 | -0.2103 |
|---|
| 43 | 146 | 93.6328 | 0.5742 | 0.3459 | 2.0000 | 2.0000 | 2.0000 | 1.4195 | 1.0964 | 1.5484 | 1.6518 | 1.6486 | 4.0128 | 2.9493 | 9.9999 | 9.9999 | 9.9999 | 0.5000 | 0.4793 | 0.4007 | 0.4078 | 0.4269 | -0.1415 | -0.2079 | -0.1872 | -0.2242 | -0.2028 |
|---|
| 44 | 147 | 0.0073 | 0.4515 | 0.4663 | 0.5652 | 0.4783 | 0.6312 | 1.3664 | 1.3579 | 1.3356 | 1.3447 | 1.4665 | 4.1947 | 4.1123 | 5.8132 | 4.1445 | 5.3348 | 4.8934e-05 | 5.6209e-05 | 5.4811e-05 | 4.8731e-05 | 5.4675e-05 | -0.0055 | 0.0365 | -0.0047 | 9.4005e-04 | 0.0123 |
|---|
| 45 | 148 | 34.1181 | 1.4920 | 1.5059 | 1.4829 | 1.5587 | 1.4745 | 0.1382 | 0.1337 | 0.1343 | 0.1493 | 0.1301 | 6.7043 | 6.5376 | 6.5376 | 6.6335 | 6.5564 | 0.0834 | 0.0600 | 0.1154 | 0.0968 | 0.0962 | -0.4996 | -0.4995 | -0.4997 | -0.4996 | -0.4997 |
|---|
| 46 | 149 | 22.6940 | 2.0000 | 1.9997 | 1.9997 | 1.9995 | 1.4898 | 0.3812 | 0.3635 | 0.3635 | 0.3635 | 0.1388 | 9.9999 | 9.9986 | 9.9986 | 9.9973 | 6.6615 | 3.2270e-06 | 3.0099e-05 | 2.9809e-05 | 4.2908e-05 | 4.7838e-05 | -0.0168 | -0.3240 | -0.3212 | -0.2759 | 0.0060 |
|---|
| 47 | 150 | 80.6691 | 0.5798 | 0.5230 | 0.1223 | 0.3117 | 1.4046 | 0.9900 | 0.8480 | 0.9945 | 1.4075 | 2.0000 | 2.8508 | 3.9132 | 0.1522 | 4.0551 | 9.9999 | 0.2989 | 0.2881 | 0.0731 | 0.3359 | 0.3198 | -0.1814 | -0.2926 | -0.5000 | -0.1617 | -0.2010 |
|---|
| 48 | 151 | 63.0235 | 1.5817 | 1.5693 | 1.6390 | 1.5880 | 1.5885 | 0.0851 | 0.0947 | 0.0903 | 0.0932 | 0.0987 | 7.3271 | 7.5587 | 7.3767 | 7.5200 | 7.5251 | 0.2197 | 0.2319 | 0.2352 | 0.2082 | 0.2121 | -0.1207 | -0.2187 | -0.0951 | -0.1725 | -0.2857 |
|---|
| 49 | 152 | 67.9910 | 1.6319 | 1.0251 | 1.0526 | 1.0916 | 1.0577 | 0.0907 | 0.0139 | 0.0738 | 1.6190e-06 | 4.8798e-06 | 7.4082 | 0.8338 | 1.1882 | 1.0874 | 1.0527 | 0.1887 | 0.1451 | 0.1435 | 0.2115 | 0.1896 | -0.0094 | -0.1382 | -0.1618 | -0.1102 | -0.0828 |
|---|
| 50 | 153 | 101.9017 | 0.2999 | 0.3475 | 0.3507 | 0.3730 | 1.8361 | 1.5610 | 1.6243 | 1.4413 | 1.5928 | 1.9985 | 4.4382 | 4.0062 | 4.1418 | 4.6515 | 9.9924 | 0.3768 | 0.4131 | 0.4422 | 0.3521 | 0.4870 | -0.0406 | 0.1276 | 0.0025 | 0.1592 | 0.0436 |
|---|
| 51 | 154 | 42.2913 | 1.5975 | 1.5054 | 1.6039 | 1.5978 | 1.6022 | 0.0953 | 0.1351 | 0.1061 | 0.1021 | 0.1000 | 7.3475 | 6.6170 | 7.5307 | 7.5005 | 7.5803 | 0.0377 | 0.0186 | 0.0387 | 0.0179 | 0.0381 | 0.0094 | -0.4982 | -0.0385 | 0.4980 | 0.0472 |
|---|
| 52 | 155 | 49.1139 | 2.0000 | 2.0000 | 2.0000 | 1.5911 | 2.0000 | 0.3788 | 0.3627 | 0.3624 | 0.1023 | 0.3618 | 10.0000 | 10.0000 | 10.0000 | 7.4850 | 9.9998 | 0.0337 | 0.0573 | 0.0603 | 0.0377 | 0.0414 | 0.0504 | 0.1373 | 0.1820 | -0.0094 | 0.4994 |
|---|
| 53 | 156 | 69.6066 | 0.4997 | 0.4980 | 0.4844 | 0.4700 | 0.4542 | 0.9635 | 0.9625 | 0.9479 | 1.4434 | 0.9238 | 2.2276 | 2.2272 | 1.8983 | 6.5570 | 1.7022 | 0.0411 | 0.0382 | 0.0361 | 0.0371 | 0.0019 | -0.3919 | -0.3501 | -0.4648 | -0.4991 | 0.4999 |
|---|
| 54 | 157 | 116.6987 | 1.0125 | 2.0000 | 1.9910 | 1.9983 | 2.0000 | 1.9405e-06 | 0.7502 | 1.3476 | 1.3737 | 0.7508 | 0.2868 | 10.0000 | 9.9602 | 9.9916 | 10.0000 | 0.1600 | 0.2869 | 0.2647 | 0.2300 | 0.3318 | -0.5000 | -0.1365 | -0.3158 | -0.5000 | -0.1669 |
|---|
| 55 | 158 | 92.9659 | 0.6404 | 1.2778 | 0.2527 | 1.2454 | 1.2904 | 1.2054 | 2.0000 | 1.5301 | 0.7841 | 2.0000 | 4.0910 | 9.9998 | 4.1191 | 3.9834 | 9.9999 | 0.5000 | 0.4462 | 0.4375 | 0.5000 | 0.4643 | -0.0660 | -0.0789 | -0.0714 | -0.0660 | -0.0572 |
|---|
| 56 | 160 | 59.1946 | 0.4923 | 0.3168 | 0.4957 | 0.2625 | 0.3483 | 1.4553 | 1.5504 | 1.4499 | 1.3884 | 1.5504 | 6.7061 | 4.7091 | 6.3644 | 5.0101 | 4.8179 | 0.1556 | 0.1731 | 0.1711 | 0.1156 | 0.1526 | 0.1429 | 0.0556 | 0.1494 | -0.0192 | 0.1068 |
|---|
| 57 | 161 | 90.5272 | 0.4890 | 1.3000 | 1.4186 | 0.3344 | 0.2593 | 1.4212 | 0.4209 | 0.4246 | 1.6068 | 1.5453 | 6.4144 | 4.0970 | 4.0726 | 4.5608 | 4.2288 | 0.3846 | 0.5000 | 0.5000 | 0.3720 | 0.3958 | -0.2000 | -0.1923 | -0.2115 | -0.2200 | -0.1316 |
|---|
Now we can check the model accuracy
sum(trialData.Chose1 == round(trialData.Prob1_ff))/height(trialData)
That's not such a bad drop in model accuracy, only around 2%. Now let's test it against the normally recovered model:
fivefold.BIC = fivefold.Deviance + log(65) * 6; % BIC = deviance + k*log(n); as written, n = 65 trials and k = 6
[~, p, ~, stats] = ttest(fivefold.BIC, altSubjectData.BIC_M4); stats.p = round(p, 3)
stats =
tstat: 11.0559
df: 56
sd: 9.8826
p: 0
So the fully fitted model is significantly better than the cross-validated one, but that's not surprising and not concerning given the small decrease in accuracy. Let's look at the similarity of the parameters; first, let's compute cosine similarity.
function cos = cosine(A, B)
cos = dot(A, B) / (norm(A) * norm(B));
cosines = zeros(1, 25); % 5 folds for each of the 5 parameters
for i = 1:5
    cosines(i) = cosine(altSubjectData.Alpha_M4, fivefold{:, i + 2});
    cosines(i + 5) = cosine(altSubjectData.Delta_M4, fivefold{:, i + 7});
    cosines(i + 10) = cosine(altSubjectData.Beta_M4, fivefold{:, i + 12});
    cosines(i + 15) = cosine(altSubjectData.Epsilon_M4, fivefold{:, i + 17});
    cosines(i + 20) = cosine(altSubjectData.Gamma_M4, fivefold{:, i + 22});
end
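As a quick sanity check that cosine behaves as intended, two made-up vectors:
cosine([2 4 6], [1 2 3]) % parallel vectors -> 1
cosine([1 0], [0 1]) % orthogonal vectors -> 0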
Now let's see the cosine for all parameters - first alpha:
Very good. Now delta
Again, very good. Now beta
Also good. Now epsilon
Nice. Finally gamma
Everything looks good here. Let's go to the fun part - model testing.
3.1 Compare Models
First, let's test the alpha + delta model (M4) against the full model (which has alpha + delta + rho).
[~, p, ~, stats] = ttest(altSubjectData.BIC_M4, subjectData.BIC); stats.p = round(p, 3)
stats =
tstat: -13.4929
df: 56
sd: 2.4684
p: 0
So M4 has a significantly lower BIC than the full model (p < 0.001), meaning it is significantly better to ignore rank reversal. How about the alpha-only model (M1)?
[~, p, ~, stats] = ttest(altSubjectData.BIC_M4, altSubjectData.BIC_M1); stats.p = round(p, 3)
stats =
tstat: -1.5722
df: 56
sd: 20.1206
p: 0.1220
So M4 is better, but not significantly, meaning that including harm aversion does not significantly improve model performance. How about the delta-only model (M2)?
[~, p, ~, stats] = ttest(altSubjectData.BIC_M4, altSubjectData.BIC_M2); stats.p = round(p, 3)
stats =
tstat: -5.2137
df: 56
sd: 33.6938
p: 0
So including inequality aversion significantly improves model performance. How about the rho-only model (M3)?
[~, p, ~, stats] = ttest(altSubjectData.BIC_M4, altSubjectData.BIC_M3); stats.p = round(p, 3)
stats =
tstat: -5.2403
df: 56
sd: 33.6367
p: 0
Including both inequality aversion and harm aversion significantly improves model performance. Let's look at the inequality and rank-reversal aversion model (M5).
[~, p, ~, stats] = ttest(altSubjectData.BIC_M4, altSubjectData.BIC_M5); stats.p = round(p, 3)
stats =
tstat: -1.7419
df: 56
sd: 2.3230
p: 0.0870
Switching harm aversion for rank-reversal aversion does not significantly lessen model performance. Interesting. Now we can examine the harm and rank-reversal aversion model (M6).
[~, p, ~, stats] = ttest(altSubjectData.BIC_M4, altSubjectData.BIC_M6); stats.p = round(p, 3)
stats =
tstat: -6.1536
df: 56
sd: 33.6839
p: 0
Switching inequality aversion for rank-reversal aversion significantly lessens model performance. What if we remove the noise and bias parameters from the full model?
[~, p, ~, stats] = ttest(altSubjectData.BIC_M4, altSubjectData.BIC_M7); stats.p = round(p, 3)
stats =
tstat: -3.5271
df: 56
sd: 100.1979
p: 1.0000e-03
So removing them does significantly hurt model performance compared to our favored model. But let's compare apples to apples here:
[~, p, ~, stats] = ttest(subjectData.BIC, altSubjectData.BIC_M7); stats.p = round(p, 3)
stats =
tstat: -3.1910
df: 56
sd: 100.3137
p: 0.0020
So including these parameters significantly improves model performance. We may wish to verify this by instantiating the model as a variant of M4 instead of the full model, but that isn't urgent at the moment. Let's move on to M8, which removes only the bias parameter.
[~, p, ~, stats] = ttest(subjectData.BIC, altSubjectData.BIC_M8); stats.p = round(p, 3)
stats =
tstat: 1.3248
df: 56
sd: 3.8013
p: 0.1910
It seems that including the bias parameter might not be very important, so we again may wish to rerun M8 as a variant of M4; but that is for another time. Let's finish up with M9, which has only left/right preferences.
[~, p, ~, stats] = ttest(altSubjectData.BIC_M4, altSubjectData.BIC_M9); stats.p = round(p, 3); stats
stats =
tstat: -4.9360
df: 56
sd: 31.8217
p: 0
So that's a significant difference, but not as big as we might expect.
Here we can conclude that inequality aversion plays a significant role in these decisions, but harm aversion does not. Also, these decisions tend to be very noisy, but are not significantly driven by biases for choosing left versus right.
3.2 Test for Individual Differences
Let's first recover parameters over the whole dataset (ignoring individual differences) so we can assess how accurate that is.
resultNID = optimize(@of_ad, initial_params([1,2,4:6]), lower_bounds([1,2,4:6]), upper_bounds([1,2,4:6]), trialData);
pars = [resultNID(1:2), 0, resultNID(3:5)];
trialData.Prob1_NID = generatePredictions(pars, trialData);
sum(trialData.Chose1 == round(trialData.Prob1_NID))/height(trialData)
Not very good. This isn’t surprising given that people often have very different preferences. Now let’s test for individual differences.
altSubjectData.Deviance_NID = zeros(size(altSubjectData, 1), 1);
for i = 1:length(included_subjects)
    trials = find(included_subjects(i) == trialData.SubjectID);
    df = trialData(trials, :);
    altSubjectData.Deviance_NID(i) = -2 * sum(df.Chose1 .* log(df.Prob1_NID) + (1 - df.Chose1) .* log(1 - df.Prob1_NID));
end
% The 5 parameters are shared across everyone, so each subject carries an equal share of the BIC penalty:
altSubjectData.BIC_NID = altSubjectData.Deviance_NID + log(65) * 5 / length(included_subjects);
[~,p,~,stats] = ttest(altSubjectData.BIC_M4, altSubjectData.BIC_NID); stats.p = round(p, 3); stats
stats =
tstat: -3.6358
df: 56
sd: 30.0943
p: 1.0000e-03
Significant individual differences. Let's see which models are worse than this no-individual-differences model:
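Here modelBIC should hold each model's total BIC in M1 through M9 order; if it wasn't kept around from the comparison step, a sketch of how it could be rebuilt (assuming the BIC_M1 ... BIC_M9 columns used above):
models = {'BIC_M1', 'BIC_M2', 'BIC_M3', 'BIC_M4', 'BIC_M5', 'BIC_M6', 'BIC_M7', 'BIC_M8', 'BIC_M9'};
modelBIC = cellfun(@(m) sum(altSubjectData.(m)), models); % summed per-subject BIC for each model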
find(modelBIC > sum(altSubjectData.BIC_NID))
So M2 (delta only), M3 (rho only), M6 (delta + rho), M7 (no epsilon or gamma), and M9 (left/right only) are worse than this model.